1 vote
After World War II, how did Americans view the role of the United States?

2 Answers

1 vote
The entry of the United States into World War II caused vast changes in the lives of virtually all Americans, and most Americans viewed their place in the postwar world with optimism.
by MPaulo (7.0k points)
1 vote

Many wanted the U.S. to retreat from global responsibilities.

by Twinmind (8.1k points)

