How did the role of women in the United States change during and after World War II?


1 Answer

The role of women changed during World War II because women took jobs in factories, offices, and other workplaces while their husbands and male relatives were away at war. The original expectation was that things would return to the way they had been once the war ended. However, many women spoke out against that idea when the war was over, and as a result many of them kept their jobs. In the decades that followed, women continued to push for and gradually won greater legal and workplace equality with men.

I hope this helps.