133k views
2 votes
How did life change for women in the United States after World War I started?

asked by Joycee (6.9k points)

2 Answers

2 votes
The war changed women's lives a great deal. Because most able-bodied men were drafted, a huge number of jobs opened up for women. Women could now volunteer or work for the Marines, Army, and Navy, and take over the civilian jobs the men had left behind. In short, the war gave them new jobs and other opportunities.
answered by Geeky Singh (6.9k points)
1 vote
Because many men were sent off to war, women took over the men's positions in the workplace. As a result, after the war women held more non-traditional roles at work.
answered by Victor Godoy (7.3k points)