How did WWI lead to new roles for women in society? Explain.

Answer:

World War I had an immense impact on women's roles in society. Women were recruited to fill the jobs left vacant by men serving in the military, and as a result they were simultaneously idealized as symbols of a home front under attack and viewed with suspicion, since their newfound independence was seen as leaving them "open to moral decay."
