I often hear that America is a "Christian Nation". Do you agree or disagree with this? And what, exactly, does that mean? Clearly, not all Americans are Christians. Does it mean that some of the first settlers in America came here for religious freedom? Does it mean that a number of the founding fathers would have called themselves Christians? Does it mean that more people in this country would currently apply the label "Christian" to themselves than would call themselves atheists or Muslims?
America was founded on principles that transcend religion, NOT principles that were born IN religion. The founding fathers understood that no single faith should reign supreme and that religion should be separate from government. THAT's what we are.
No.... "One nation, indivisible". People forget that part and focus too much on the "under God" part. Listen to the WHOLE pledge sometime; it's quite illuminating and has nothing to do with religious ideology.