How did American industry change from the beginning of WW2 to its end?


1 Answer

Answer:

During the war, the roles of women became much more important. With men sent away to fight, women stepped into factories and other war work at home, and their contributions were more widely recognized. This experience empowered women and continued to shape their roles even after the war ended.

Step-by-step explanation:

Hope this helps!
