Is the United States a "Christian nation," as many politicians often describe it? Should it be? Justify your answer.

Asked by Tronum

1 Answer


Answer: Typically, what is at issue in the political culture war is not religion but which side is more ... "on the contrary, the United States in its essence has always been a Christian nation, and ... there would be much more than a ripple of public comment. ... the answer to our question is that America is not a Christian nation, but misguided.


Answered by Andrewjamesbowen