I'm tired of Christian Fundamentalists calling the United States "a Christian nation." Although many of the Founding Fathers may have NOMINALLY belonged to Protestant sects, their words suggest that they were Deists or Unitarians who had no particular preference for Christianity and no intention of elevating it above other religions.
Anotherperspective.org, under the heading "Religion and the Founding Fathers," lists Thomas Jefferson, Thomas Paine, and George Washington alongside quotes that express their generally negative attitudes toward Christianity.
Under the heading "Separation of Church and State?", About.com makes a reasonable case against the Fundamentalist argument that "the founders meant only that no sect of Christianity was to be elevated above another, but still meant our government to be Christian..."
The Quartz Hill School of Theology makes the following essential point:
Many well-meaning Christians argue that the United States was founded by Christian men on Christian principles. Although well-intentioned, such sentiment is unfounded. The men who led the United States in its revolution against England, who wrote the Declaration of Independence and put together the Constitution, were not Christians by any stretch of the imagination.
Why do some Christians imagine these men are Christians? Besides a desperate desire that it should be so, in a selective examination of their writings one can discover positive statements about God and/or Christianity. However, merely believing in God does not make a person a Christian.
The United States is unlike any other country in the world in that it was founded on freedoms: freedom of speech, freedom of the press, freedom of religion, and it has stayed that way for over 200 years. It is a country that has welcomed people from all over the world:
Give me your tired, your poor, your huddled masses yearning to breathe free...