4 votes
What social changes took place in the United States after WWII? What role did the war play in those changes? PLEASE HURRY!!

asked by Scott Lin (7.3k points)

1 Answer

3 votes

Answer:

After World War II, the United States was in a far stronger economic position than it had been before the war. It was after the war that the United States emerged as a global superpower, exerting influence in economic, political, military, cultural, and technological affairs.

answered by Ratsbane (8.1k points)
