41 votes
How did the war help women gain more respect and rights?

by Log N (2.7k points)

1 Answer

29 votes

World War II greatly changed the role of women and what they believed they could accomplish. Before the war, women were expected to do little more than care for the house and children, but the war changed that.

Enormous quantities of goods were needed during World War II: vehicles, ammunition, weapons, and medical supplies. With so many men away fighting, there were not enough workers to produce them, so many women stepped into factory jobs and manufactured materials for the war effort.

Women also planted "Victory Gardens," which boosted self-sufficiency, patriotism, and morale while providing extra food for both the men in service and the people at home. Women also managed food rationing, using ration cards that limited how much of a certain food each household could buy in a given week.

Many women also served as nurses and doctors, both near the front lines and at home. They often had to carry out grim work, such as surgeries and amputations, and because of their care, many men survived the war and came home to their families.

Although most women returned to their traditional household roles after World War II ended, the respect they had earned was immense. Many women had enjoyed their time in the factories, which fueled a desire for more rights and the freedom to work. This helped spark a larger movement that eventually won women greater rights.

by JEY (3.2k points)