What happened to women's roles after ww1?


Answer:

After the end of World War I, women's roles underwent a significant shift. Some of the changes that occurred include:

Return to traditional roles: Many women were expected to return to their traditional roles as homemakers and mothers after the war, as men returned to the workforce.

Loss of jobs: As men returned from the war, many women were laid off from their jobs or forced to leave to make way for the returning veterans.

Discrimination: Women faced discrimination in the workplace, as they were often paid less than men and faced limited opportunities for advancement.

Protests: Women who had gained new economic and social independence during the war protested and organized to retain the rights and roles they had acquired.

Changes in societal attitudes: Despite the push back toward traditional roles, the wartime shift in attitudes about women's capabilities and their right to work outside the home persisted.

Women's rights movements: The war had a profound impact on women's rights movements. Women's experiences during the war changed their perceptions of their own roles and capabilities, and they began to demand greater rights and opportunities.

Political participation: Women's political participation increased as a result of their wartime work, and many countries granted women the right to vote shortly after the war (for example, Britain and Germany in 1918 and the United States in 1920).

Overall, while women's roles did shift back toward tradition after the end of World War I, the war had a lasting impact on women's rights and opportunities, laying the foundation for further progress in the decades that followed.
