11 votes
What happened to the United States during WWI?

asked by Scottrakes (8.5k points)

2 Answers

12 votes
I think it's C, but I'm not 100% sure!
answered by Airborn (8.7k points)
4 votes

Answer:

After World War I, most Americans concluded that participating in international affairs had been a mistake. They sought peace through isolation and, throughout the 1920s, advocated a policy of disarmament and nonintervention. As a result, relations with Latin American nations improved substantially under Hoover, an anti-imperialist.


answered by Streight (8.4k points)
