By the 1960s, women in American society

1 Answer


Answer:

In the 1960s, deep cultural changes were altering the role of women in American society. More women than ever were entering the paid workforce, which heightened dissatisfaction over wide gender disparities in pay and advancement, as well as sexual harassment in the workplace.
