A major change women experienced during the post-World War I era was that they started

by User Tony Park

2 Answers


Answer:

They began to work outside the home.

Step-by-step explanation:

Took the test

by User John Polo
Since the men were away at war, women began to work as nurses, in factories, and in other workplaces.
by User Yin Gang