19.8k views
5 votes
How did the greater participation of women in the workforce change their role in American society?

by User Taufi (5.7k points)

2 Answers

3 votes

Women could live on their own and enjoy greater independence.

by User Yohannes Masterous (5.7k points)
3 votes

Women's role in American society:

Social and gender inequalities long kept women out of the forefront of employment. However, women have participated in the labor market in large numbers since World War II, when they had to provide for their families while many men were away serving as soldiers; this opened the way to broader employment.

In addition, the rise of the feminist movement further increased employment among women, as many were able to go to school and attain qualifications that enabled them to become doctors, lawyers, and other professionals. This also increased social mobility, gave women the ability to head households, and tilted the scales of gender inequality somewhat.

by User Dwhalen (6.7k points)