How did the role of women in the United States change during and after World War II?

1 Answer

Answer:

Explanation: Women began to push back against traditional roles. During the war many took on jobs outside the home, and afterward not all of them wanted to go back to simply staying home and cleaning.

answered by Indigenuity