1. Do you believe the United States is becoming more secularized or more fundamentalist? Please explain your position.

asked by Simongking (7.6k points)

1 Answer


Answer: Secularized

Explanation: Based on what I've seen and my research on the subject, I believe the United States is secularizing. Christianity was foundational to the United States: the country was settled largely by Christians who had fled religious persecution, and much of its early culture and institutions were built on that faith. That no longer seems to be the case. Many Americans are moving away from religion, concluding that there is no God, no heaven, no hell, and no afterlife, and that you should simply live your life as you see fit. I believe this shift is largely a product of modernity, and it is not limited to Christianity; other religions are affected as well.

To be clear, this doesn't apply to everyone; there are still many people who practice a religion or follow Christ. My point is that, in my opinion, the United States as a whole, or at least the majority, is secularizing. I do think there is still a chance that America could return to fundamentalism, but right now that does not look likely.

To be clear, this is merely my opinion, based on my own historical research. It may not be the best answer, but it is the most straightforward one I could come up with. I hope this was helpful.

answered by Inuyasha (9.1k points)