How did World War I change the lives of American women?
It increased educational opportunities for women.
It delayed the extension of voting rights to women.
It made military service mandatory for young women.
It broadened job opportunities for women.

1 Answer


The correct answer is:

It broadened job opportunities for women.

Women began to take a much larger role in industry. Job opportunities expanded because most men were away fighting the war, leaving those positions open to women and African Americans.

by Mark McClelland (4.8k points)