World War II was one of the most significant events of the 1900s and one of the most important events in US history. Think about how much the United States changed between the Great Depression and the postwar era, when the country had become an economic powerhouse. How did World War II influence and change the identity of the United States throughout the 1900s and into the present? What are some positive and negative changes that occurred in the United States in the years after World War II?

1 Answer


Answer:

America's involvement in World War II had a significant impact on the economy and workforce of the United States. The country was still recovering from the Great Depression, during which unemployment had peaked near 25%, and joblessness remained high on the eve of the war. Entry into the war soon changed that. American factories were retooled to produce goods for the war effort, and unemployment plummeted as war production ramped up. As more men were sent away to fight, women were hired to take over their positions on the assembly lines. Before World War II, women had generally been discouraged from working outside the home; now they were being encouraged to take over jobs traditionally considered 'men's work.'

answered by Fabian Nicollier (5.3k points)