It's a very good question. Obviously, as one of the religions brought over with our ancestors, it makes sense that it would have taken root early. But there was a definite rise in its importance over the years. For example, "Under God" was added to the Pledge of Allegiance and "In God We Trust" was adopted as the national motto only in the 1950s, long after the founding of our country.
And, despite claims that the founders based this country on Christian values, they were the architects of church/state separation. So, clearly, the obsession came later. I know there was a push during the Cold War to distinguish Americans from communists. And there was likely a rise during the world wars too, since all one could do was pray for the survival of loved ones.
But I honestly think the debate has gotten fiercer in recent years because alternatives are more readily available. The U.S. has often been a pioneer of new sciences and technologies; throughout our past we have avidly sought answers in the name of progress. But it seems to have reached a tipping point where suddenly we're debating what should be in school textbooks. It's no longer about the pursuit of answers and facts, but rather about "teaching the controversy." This is spurred by a floundering education system that we can't seem to correct.
In other words, our country's pursuit of knowledge has led us away from a religion that is, for all intents and purposes, our 'default'. And those who are religious have two options: either embrace the new information and adapt their beliefs, or dig in their heels and fight back. It's not a pretty sight, but ultimately, that which adapts has the best chance of survival.