Women had many new experiences as a result of World War I, including working at new jobs, wearing new fashions, and acting more independently. What other new change came to women just after WWI?


1 Answer

After World War I, women became an increasingly visible part of mainstream life, which eventually led to women gaining the right to vote.

Before the war, the daily lives of women were built around traditional roles such as mother and housewife.

However, with the war, things changed dramatically, eventually leading to women earning the right to vote in general elections.