What do you think when you hear someone refer to America as a "christian nation"?
To be Christian means to live, behave, and believe in Christ, following closely in his footsteps. I doubt that Americans do that.
In my opinion, America is no longer a "Christian nation"; we have turned away from God and are now following man. There are many who claim they are Christians, but their actions say otherwise. I would estimate that approximately 20% of Americans are Christians, aka Christ-followers, who live out what they believe.
I think they are wrong. America is no longer a Christian nation. We have strayed from the wishes of our founding fathers.
As a Catholic (a Christian), I feel it's less of a 'burden' to bear when I know a 'first-class' nation takes Christianity seriously.
As a Christian, I find myself praying regularly and applying Christian principles in my life. If an entire nation is seen as Christian, less explanation is needed, and general acceptance of Christian morality becomes the norm.
What do I think? I think they are uneducated and misinformed. First, the "founding fathers" were not the Christians that people like to think they were. At the time of the founding of our country, we were basically pagan compared to the Christian European countries. People need to actually study history... and hearing it from your pastor does NOT make it even remotely true (same goes for news anchors and politicians).
America is most definitely not a Christian nation, so I would probably think "delusional"?
I think of our Calvinist heritage. I don't know if you've ever heard of Max Weber's "The Protestant Ethic and the Spirit of Capitalism." Anyway, the theme of some of my writings is how America is a Calvinist nation (and so, if you count Calvinism as a version of Christianity, then the United States is, indeed, a Christian nation), and I believe that capitalism is basically monetized Calvinism.
And as I see it, that Calvinism works in various and surprising ways.
I think it is a contradiction in terms, as America was founded by a nation of "Christians" who claim to be true, pure, and always right about everything, yet stole this land from its natives, forced them to live on reservations, and founded segregation and deceit all at the same time.
I think, lmao. Sure, there are Christians there, but it is nowhere near what Christianity is supposed to be about. The crime rate alone destroys the Christian tag.
I'll say they are wrong. It's not that I believe Americans are no longer good Christians, but the country has placed its status as a "free nation" so high that almost every god is served there... I just see them as neutral in that aspect.
I think people are grossly misinformed when they say the Founding Fathers were Christians and wanted to establish a Christian nation. There are a few quotes that would lead you to believe that it is a Christian nation, but the Founding Fathers themselves tended to waffle on the theory, depending on the election season.
However, some astounding circumstantial evidence does exist.
First, many of these Founding Fathers were deists, which is not the same as modern-day Christendom or the Christianity of the time. For example, Thomas Jefferson was so out of sync with the existing sects of Christianity that he practiced his own variant of it. He even wrote his own Bible.
Second, there is a very clear legislative basis for the claim that this is not a Christian nation. The first piece is, of course, our fundamental right to the separation of church and state. Jefferson wrote some great lines on this.
The second piece of legislative evidence comes from the Treaty of Tripoli, Article 11. Wiki it. Keep in mind that this was ratified in 1797 by many of the original members of Congress and was signed into law by John Adams. You can look up who was in Congress at the time, but I think it serves to note that this treaty was ratified unanimously, and Adams did not choose to veto it.
Essentially, Article 11 states, in no uncertain terms whatsoever, that America is not a Christian nation and was never intended to be one.
by Prophecy Teacher 2 years ago
Is it reasonable to assume that 13 different Christian-chartered colonies would send to a convention in Philadelphia a group of men - to make a Constitution - that allowed their way of life to end? Is it further reasonable to believe that those Christians picked only deists to represent them?...
by MPChris 5 years ago
Do you think the Founding Fathers intended a Christian nation? This question is in specific reference to two important pieces of legislative evidence: both the 1st Amendment and the Treaty of Tripoli (Article 11). Keep in mind that the Treaty was ratified by many of the original founding fathers,...
by Justin Aptaker 6 years ago
Is America a "Christian Nation"? What does that term even mean? I often hear that America is a "Christian Nation". Do you agree or disagree with this? And what, exactly, does that mean? Clearly, not nearly all Americans are Christians. Does this mean that a few of the first...
by cooldad 7 years ago
Were the founding fathers of this country Christians? I have always been under the impression that our country was founded by people who were escaping religious persecution. Why, then, do so many people claim that this country was created under Christianity? I could be wrong, but I...
by paarsurrey 7 years ago
Hi friends. Is America a Christian nation? Thanks. I am an Ahmadi, peaceful Muslim.
by Mick Menous 3 years ago
As far as this whole separation of church and state thing in the United States is concerned, I personally believe that it is necessary. Granted, the words "separation of church and state" are not physically found in the US Constitution. Yet it does say, quote: "No law shall be passed...