What effect did World War II have on culture in the United States?

asked by Simontemplar (2.9k points)

1 Answer

9 votes
Following World War II, the United States emerged as one of the two dominant superpowers, turning away from its traditional isolationism and toward increased international involvement. The United States became a global influence in economic, political, military, cultural, and technological affairs.


Does this help? If not, please let me know.
answered by Shacole (3.2k points)