How did life change for women in the United States after World War I started?

A) More women had children.
B) Women lost the right to vote.
C) Women began to earn more than men.
D) More women got jobs outside the home.


1 Answer

Answer:

D) More women got jobs outside the home. With so many men away serving in the military, women stepped into jobs in factories, offices, and other workplaces that had previously been held mostly by men.
