How did the role of women in the United States change during and after World War II?

1 Answer


Answer:

Explanation: During the war, many women took on work outside the home, and afterward many began to push back against traditional expectations; not all women wanted to simply stay home and keep house.

answered by Indigenuity (7.7k points)

