74.0k views
11 votes
What happened to the United States during WWI?

by User Scottrakes (4.1k points)

2 Answers

12 votes
I think it's C, but I'm not 100% sure!
by User Airborn (4.8k points)
4 votes

Answer:

After World War I, most Americans concluded that participating in international affairs had been a mistake. They sought peace through isolation and, throughout the 1920s, advocated a policy of disarmament and nonintervention. As a result, relations with Latin American nations improved substantially under Hoover, an anti-imperialist.

by User Streight (4.5k points)