136k views · 3 votes
How did the end of World War II change women's roles?

asked by User Tejay (4.0k points)

2 Answers

4 votes

Answer:

C

Step-by-step explanation:

UwU

answered by User Jfn (4.6k points)
6 votes

Answer:

C

Step-by-step explanation:

As a mother and a teacher, it saddens me to think that I would have had to send my son and husband to war back then. There was really no choice. Hitler thought we were crazy for allowing women to work in the war effort; in his ideology, women were supposed to be wives and mothers and keep having babies for their race....

But it all changed for America after Pearl Harbor. As the men went to war, women had to do everything from taking care of the home to managing the finances and even the car. They were hired to work in defense plants, and along came Rosie the Riveter! Women helped make sure the Allies had all the war materials they needed to defeat the enemy.

answered by User Harsh Pokharna (4.6k points)