After World War II, how did Americans view the role of the United States?

Asked by MotohawkSF (2.8k points)

1 Answer


Answer:

Following World War II, the United States emerged as one of the two dominant superpowers, turning away from its traditional isolationism and toward greater international involvement. It came to exert global influence in economic, political, military, cultural, and technological affairs.

Answered by Shateek (3.5k points)