26 votes
How did the world wars change women's roles in American society?

asked by Nikhil Vidhani (3.4k points)

1 Answer

18 votes
Answer:
The wars created jobs for women.
Step-by-step explanation:
With the men away fighting, women took on jobs such as teaching and working in factories.
answered by Michael Innes (2.7k points)