How did the war change the lives of women and African Americans as a result of their participation in WWI?

a) Women were granted the right to vote in several countries.
b) African Americans gained greater civil rights and recognition.
c) There were no significant changes for women or African Americans.
d) Women and African Americans were further marginalized.

1 Answer


Final answer:

World War I opened up new economic and social opportunities for women and African Americans, leading to expanded political rights and greater awareness of and demands for civil rights.

Step-by-step explanation:

The participation of women and African Americans in World War I brought substantial changes to their societal roles and opportunities. Many women found themselves working outside the home for the first time, in jobs previously held by white men. This shift, driven by the demands of the wartime economy, fostered women's independence and eventually contributed to expanded political rights, including the achievement of women's suffrage in several countries.

Similarly, African Americans took on roles previously denied to them, which created new economic and social opportunities. The wartime experience, particularly for African American servicemen who served abroad, heightened awareness of and demands for greater equality and civil rights. These advances did not come without challenges and resistance, however, and in the post-war era there was strong pressure to return to the pre-war social order.
