Describe the social, economic, and political changes that occurred after World War II.

1 Answer

World War II brought the U.S. together socially, made America a superpower, helped pull the country out of the Great Depression, and, unfortunately, moved the U.S. closer to socialism.

I hope my answer has helped. God bless and have a nice day ahead!
answered by Ashante