Hello!
In the past, women were primarily expected to stay at home, raising and teaching their children. Men, on the other hand, were expected to hold jobs and provide for their families.
When women began entering the workforce, America's views on gender roles shifted, because the role of women itself was changing. Instead of working only in the home, women took on paid jobs in the broader economy. Today it is entirely normal to see women working for companies, running their own businesses, and so on.
I hope this helps you!