5 votes
A major change women experienced during the post-World War I era was that they started...

asked by Tony Park (8.6k points)

2 Answers

3 votes

Answer:

They began to work outside the home.

Step-by-step explanation:

I took the test.

answered by John Polo (8.9k points)
4 votes
Since the men were away at war, women began to work as nurses and in factories and other workplaces.
answered by Yin Gang (7.2k points)

