164k views
0 votes
How might women entering the workforce have changed America's views on traditional gender roles?

2 Answers

2 votes
Women were not treated as equals: they were expected to stay at home and did not have the same rights as men. America is more feminist than racist. So by getting jobs, women proved they could be independent.
by SeyyedMojtaba (6.7k points)
0 votes
Hello!

In the past, women were primarily expected to stay at home and to teach and raise their children, while men were expected to hold jobs and provide for their families.
When women began entering the workforce, America's views on gender roles changed because the role of women itself was changing. Instead of working only in the home, women began taking paid jobs in society. Today it is entirely normal to see women working for companies, as independent workers, and so on.

I hope this helps you!
by Michal Kaut (6.9k points)