How did the world wars change women's role in American society?

1 Answer

Answer:
The wars created jobs for women.
Step-by-step explanation:
With men away fighting, women filled roles outside the home, working as teachers and in factories, which expanded their place in the American workforce.