2 votes
PLEASE HELPPP MEEE PLSSS!!!

How did the Spanish-American War change US foreign policy?

1. It led to greater trade in Asian nations

2. Yellow journalism led to foreign alliances

3. The United States began controlling colonies outside of its borders

4. The war ushered in a decade of neutrality in the Western Hemisphere

by Underyx (5.3k points)

2 Answers

6 votes

Answer:

I'm thinking 3

Step-by-step explanation:

I searched it, and America's foreign policy changed from isolationism to imperialism during the Spanish-American War. America was now willing and able to get involved in foreign affairs around the world to expand its empire.

I'm confident in 3!

Hope this helps, GretaVanFleetRocks

by Kundu (6.1k points)
1 vote

Answer:

America's foreign policy changed from isolationism to imperialism during the Spanish-American War. America was now willing and able to get involved in foreign affairs around the world to expand its empire. How did the United States develop an overseas empire? It annexed Guam, Puerto Rico, and the Philippines, and placed Cuba under its control as a protectorate.

Read this, and I hope it helps you find your answer :)

by Sahbeewah (5.6k points)