123k views
2 votes
How did World War II change the role of corporations in American life?

asked by Rubndsouza (7.2k points)

2 Answers

4 votes
Hey there,
World War II transformed the role of the federal government and the lives of American citizens. To secure industrial peace and stabilize war production, the federal government forced reluctant employers to recognize unions.

Hope this helps :))

~Top
answered by Zach Russell (7.7k points)
2 votes
World War II changed the role of corporations in American life: once the fighting stopped, people simply moved on with their lives and did not know what to do after that. Hope this helped, have a great day! :D
answered by Tszming (7.2k points)