36 votes

What happened to jobs for women after the end of World War I?


- Women surrendered their jobs to returning soldiers.

- Women sought advanced training to get professional jobs.

- Women were promoted to leadership positions.

- Women were able to get better paying jobs using skills learned while working during the war.

by Baris Atamer (2.7k points)

1 Answer

16 votes

Answer:

A) Women surrendered their jobs to returning soldiers.

Step-by-step explanation:

After the end of World War I, many women who had taken on jobs during the war were expected to return to their traditional roles as homemakers and caregivers. In practice, this usually meant surrendering their jobs to returning soldiers, who were seen as the family's primary breadwinners. Some women were able to keep their positions or pursue advanced training for professional work, but those opportunities were limited by gender discrimination and other barriers. Despite these obstacles, a number of women used the skills and experience gained during the war to secure better-paying jobs and greater financial independence. Overall, the end of the war marked a major shift in the roles and expectations of women in the workforce, and many faced real difficulty balancing work and family responsibilities.

by Lsabi (2.8k points)