330,814 views
23 votes
How did World War I change women's roles in the United States?

by Linnette (3.2k points)

1 Answer

15 votes
Because of WWI, more women joined the workforce. Not only did their employment opportunities increase, but they also found a wider range of jobs open to them, including teaching, sales, and textile factory work. Some women served as nurses near the front lines as well. Hope this helps!
by Bgdnlp (3.2k points)