How did women's role in society change after WW2?

1 Answer

Answer:

Women's roles continued to expand in the postwar era.

Step-by-step explanation:

Women who remained in the workplace after the war were usually demoted to lower-paying positions. Yet after women's wartime efforts, men could no longer easily claim superiority over them. Many women had enjoyed, and even thrived on, a taste of financial and personal freedom, and many wanted more of it.
