How did World War I change the role of women in American society and the military?

1 Answer


Answer: World War I opened up many new roles for women in American society.

Tens of thousands of women joined the Women's Land Army to work the soil, fields, and orchards, freeing men for military service. They took to the land gladly and brought in the harvest during the war years, supplying food to the nation, the military, and the Allies.

During WWI (1914-1918), large numbers of women were also recruited into jobs vacated by men who had gone off to fight. The high demand for weapons made munitions factories the largest single employer of women by 1918.

Hope this helps, have an awesome night/day ❤️✨


answered by Gsone (3.5k points)