Final answer:
World War I opened new economic and social opportunities for women and African Americans, leading to expanded political rights for women and to greater awareness of, and demands for, civil rights among African Americans.
Step-by-step explanation:
The participation of women and African Americans in World War I brought substantial changes to their societal roles and opportunities. Many women took paid work outside the home for the first time, filling industrial and clerical jobs previously held by men. This shift, driven by the demands of a wartime economy, fostered women's economic independence and contributed to expanded political rights, including the extension of suffrage to women in several countries, such as Britain in 1918 and the United States in 1920.
Similarly, African Americans took on economic and social roles previously denied to them. The war experience, particularly service abroad for African American soldiers, heightened awareness of inequality at home and strengthened demands for equality and civil rights. These advances met challenges and resistance, however, and in the postwar era both groups faced pressure to return to the pre-war social order.