What are some positive and negative changes that occurred in the United States in the years after World War II?

1 Answer


The negative effects of war are, of course, death and destruction. Tens of millions of people died in World War II, and cities across Europe and Asia were devastated, crippling national economies. These losses caused immense human misery.

There are, however, also possible positive effects of war. War can correct terrible injustices; for example, the Allied victory in World War II ended the Holocaust and toppled fascist regimes in Germany and Italy.

by Lezlie (8.0k points)
