Did the war change the role of women in American society?

1 Answer

Yes. During WWI (1914-1918), large numbers of women were recruited into jobs vacated by men who had gone off to fight. New jobs were also created as part of the war effort, for example in munitions factories.
answered by Analia