The short answer is yes.
The much longer answer is that we were not founded as a Christian nation, and our roots are not in religion. The Founding Fathers were primarily Unitarians and Deists who had a strong disdain for organized religion within the construct of government. Thomas Jefferson, the primary author of the Declaration of Independence, compiled the Jefferson Bible, still in print today: a version of the Gospels following the life of Christ without the miracles.
There have been two major movements to push this nation closer to a theocracy: one directly after the Civil War and another after WWII. Each made small inroads, but the 1950s saw the most pronounced changes: our national motto became "In God We Trust," placed on all our monetary instruments, and the phrase "one Nation under God" was inserted into the Pledge of Allegiance, effectively turning the Pledge into a prayer.
In the 1990s we began to see the next surge of the Christian Right, this time its most radical form yet: the Christian Coalition, led by Pat Robertson. This dramatic shift rightward reflects a change in ideology toward Christian Reconstructionism, or Theocratic Dominionism. Many leading Republican governors, presidential candidates, and current federal and state legislators belong to churches within this coalition and believe in Theocratic Dominionism, that is, the belief that the United States of America should be a Christian nation living under Old Testament law.
They further believe, as one of the leading scholars of Christian Reconstructionism, Rousas John Rushdoony, explained, that no person should be allowed to worship any god but the Christian God, and that the First Amendment was intended only to protect the church from government.