Philosophy in Action
5 votes

Is the United States a Christian nation?

People often claim that the United States is "a Christian nation." What do people mean by that? Why does it matter? Is it true or not?

Anonymous, 02.04.2012, 20:03
Idea status: completed

Comments

ZombieApocalypse, 02.04.2012, 21:03
If it were a Christian nation, or meant to be a Christian nation, the founders would have spelled it out in the Constitution and established a religion. Yes, there are references to God in documents like the Declaration of Independence, but that was just so they could get the point through the king's thick skull.

I think I liked it better when the various Christian sects were warring with each other. Now that they have set aside enough of their differences to band together and call this a "Christian country," we have to deal with distractions like these.
