Answer:
The war greatly changed the role of women in America. Before the war, women were largely restricted to domestic life, meaning they were discouraged from working outside the home.