175k views
0 votes
How did World War I change women’s roles in the United States?

by Thenetimp (6.9k points)

2 Answers

1 vote
Women became greatly valued for their ability to work; jobs for women became easier to come by, and pay increased.
by Martin Ueding (7.3k points)
2 votes

World War I changed women’s roles in the United States because women replaced men in the workforce.

During WWI (1914–1918), large numbers of women were recruited into jobs that had previously been carried out by men who had gone off to fight in the war. Furthermore, new jobs were created as part of the war effort, for instance in munitions factories.

As a result of the increasing demand for weapons, munitions factories became the largest single employer of women by 1918.

by Mouser (6.4k points)