1 vote
What happened to jobs for women after the end of World War I?

Women surrendered their jobs to returning soldiers.

Women sought advanced training to get professional jobs.

Women formed labor unions to fight discrimination in the workplace.

Women were able to get better paying jobs using skills learned while working during the war.

asked by User Pickels (8.1k points)

2 Answers

4 votes

Answer:

Women surrendered their jobs to returning soldiers.

Explanation: I took the test, and this answer was marked correct.

answered by User OmriToptix (7.5k points)
4 votes
Women formed labor unions to fight discrimination in the workplace.
answered by User Khinsen (7.5k points)