How has U.S. foreign policy impacted Africa, Asia, the Caribbean, Latin America, and the Middle East?


1 Answer


Answer:

After World War I, U.S. foreign policy turned toward isolationism. Through measures such as the Neutrality Acts of the 1930s, the United States declared that it would not supply weapons to nations at war.
