How did World War II influence and change the identity of the United States throughout the 1900s and into the present? What are some positive and negative changes that occurred in the United States in the years after World War II?

2 Answers


Answer:

During the Great Depression there were huge numbers of unemployed and impoverished people, and FDR addressed that crisis with the New Deal. While the recovery was still underway, World War II began and the United States entered the war. The United States emerged from the war as the richest country in the world: its economy, industry, and infrastructure had not been damaged, unlike those of most other nations, so it exported heavily and the standard of living skyrocketed. The country went from extreme poverty to extreme wealth in less than fifteen years. This strengthened the idea of the American Dream, but only for some people. Following the war, the predominantly white middle class grew wealthier, while minorities, largely shut out of that prosperity, lost more and more power. Segregation remained widespread, and African-American communities faced heavy discrimination. Women, meanwhile, were expected to be housewives, since men were earning more than enough to support an entire family. This lasted until the 1960s, when the traditional notion of what was good in America began to be deconstructed.

User AbsoluteBeginner
To answer the question above: World War II influenced the identity of the US by bringing the country out of the Great Depression. A positive change was that economic recovery; a negative change was the development of nuclear weapons and the onset of the Cold War.
User Camdub