How did World War II transform American society?

1 Answer

Hey there,
1) Wartime production helped pull the United States out of the Great Depression: factories supplying arms and equipment to the Allies, and later to the U.S. military itself, ended mass unemployment.
2) With so many men serving overseas, women entered the workforce in large numbers, taking factory and industrial jobs that had previously been closed to them.

Hope this helps :))

~Top