The United States won the war against Spain and was now considered a:

1 Answer


Answer:

World power.

Step-by-step explanation:

The United States defeated Spain in the Spanish-American War of 1898 and established itself as a dominant force in the Americas. Spain was not only defeated but also lost all of its colonies in this part of the world. The United States, on the other hand, strengthened its role by gaining new territories in the Pacific and in Latin America.
