32 votes
Please help!

How did the Spanish-American War change US foreign policy?

1. It led to greater trade with Asian nations

2. Yellow journalism led to foreign alliances

3. The United States began controlling colonies outside of its borders

4. The war ushered in a decade of neutrality in the Western Hemisphere

User Ba, 2.7k points

2 Answers

16 votes

Answer:

I'm thinking 3

Step-by-step explanation:

I searched it, and America's foreign policy changed from isolationism to imperialism during the Spanish-American War. America was now willing and able to get involved in foreign affairs around the world to expand its empire.

I'm confident in 3!

Hope this helps, GretaVanFleetRocks

User Thegio, 3.2k points
18 votes

Answer:

America's foreign policy changed from isolationism to imperialism during the Spanish-American War. America was now willing and able to get involved in foreign affairs around the world to expand its empire. How did the United States develop an overseas empire? It acquired Guam, Puerto Rico, and the Philippines, and made Cuba a protectorate.

Read this and I hope it helps you find your answer :)

User Loentar, 2.4k points