After World War II, how did Americans view the role of the United States?


1 Answer

Answer:

Following World War II, the United States emerged as one of the two dominant superpowers, turning away from its traditional isolationism and toward increased international involvement. The United States became a global influence in economic, political, military, cultural, and technological affairs.

