4 votes
What social changes took place in the United States after WWII? What role did the war play in those changes? PLEASE HURRY!!

asked by Scott Lin (4.4k points)

1 Answer

3 votes

Answer:

After World War II, the United States was in a much stronger economic position than it had been before the war. It emerged from the conflict as a global superpower, with wide-reaching influence in economic, political, military, cultural, and technological affairs.

answered by Ratsbane (5.0k points)