How did working during World War II change American women?

1 Answer

Answer:

Though women had been joining the workforce in greater numbers since the hardships of the Great Depression, the entry of the United States into World War II completely transformed the types of jobs open to women. Before the war, most working women were in traditionally female fields like nursing and teaching. During the war, women moved into factory and other industrial jobs that had previously been open only to men.

