How did World War II transform the United States domestically and change its relationship with the world?


1 Answer


Answer:

Explanation:

Following World War II, the United States emerged as one of the two dominant superpowers, turning away from its traditional isolationism and toward sustained international involvement. It became a leading global influence in economic, political, military, cultural, and technological affairs.
