What changes took place in American life after the US entered WWII?


1 Answer


Final answer:

After the US entered WWII, American life changed through nationwide mobilization, rationing, the conversion of industry to war production, and shifts in societal norms, including women's roles and civil rights.


Step-by-step explanation:

After the United States entered World War II, American life changed significantly. The entire country mobilized for war: the government implemented rationing and price controls to conserve resources for the war effort, and industries converted to manufacturing military equipment and supplies, which boosted employment and drove economic growth. The war also had a profound impact on society, particularly regarding women's roles and civil rights. Women took on jobs traditionally held by men, contributing to the rise of the working woman in American society, while military service by African Americans and other minority groups challenged racial segregation and discrimination and fueled demands for equal rights.

