What social, economic, and political changes occurred in the U.S. during WWI?

a) Social changes included increased patriotism and war-related propaganda.
b) Economic changes involved the growth of industries to support the war effort.
c) Political changes included the passage of the 19th Amendment granting women the right to vote.
d) All of the above.


1 Answer

Final answer:

During WWI, the U.S. saw social changes like increased patriotism, economic growth in industries, and political milestones such as women's suffrage. Workers gained rights, and the roles of women and African Americans in the workforce expanded significantly. These changes had lasting effects on American society.

Step-by-step explanation:

d) All of the above. During World War I, the United States experienced significant social, economic, and political changes. Social changes included heightened patriotism and pervasive war-related propaganda, which affected the national psyche and everyday life.

Economic changes were evident in the growth of industries vital to the war effort, which helped the U.S. develop into an industrial power. The war also catalyzed political changes, notably the passage of the 19th Amendment, which granted women the right to vote and marked a significant advance in gender equality. Moreover, the U.S. saw changes in labor organization as workers gained the right to form unions and advocate for their interests, contributing to Progressive reforms such as the eight-hour workday.

Women and African Americans found new opportunities as they filled positions vacated by men who went to war. Post-war America attempted a return to 'normalcy', but the transformative effects of the war, such as the growing political awareness among workers and the expanded societal role of women, continued to influence the country.
