1 vote
World War II was one of the most significant events of the 1900s and one of the most important events in US history. Think about how much the United States changed between the Great Depression and the postwar era, when the country had become an economic powerhouse. How did World War II influence and change the identity of the United States throughout the 1900s and into the present? What are some positive and negative changes that occurred in the United States in the years after World War II?

by User Kezia (8.0k points)

1 Answer

5 votes
One positive effect is that the United States won the war; a negative is the spell of unemployment that followed as war industries wound down and millions of service members returned home. I hope this helps. Have a wonderful rest of your day.
by User Cgold (8.6k points)