Did America need to be an imperial nation?

Can someone break down this question for me? I'm having trouble understanding it :(

by User Rul (3.9k points)

1 Answer


Answer:

The United States became an imperial power through land acquisition and war. The ideology behind this expansion is often called "Manifest Destiny": the belief of many Americans that the United States was destined to control all of the land between the two oceans. Guided by this belief, the United States expanded westward and then projected its power abroad, particularly in Latin America and the Pacific. One significant war that made the United States more of an official imperial power was the Spanish-American War of 1898. By winning this war, the US acquired possessions in the Philippines, Guam, and Puerto Rico (Hawaii was annexed separately that same year).

When Theodore Roosevelt became president in 1901, he helped the United States become even more of an imperial power by intervening in Latin American affairs (the Roosevelt Corollary), building the Panama Canal, and sending his Great White Fleet of naval ships around the world. By the time the United States entered World War I, it was already an established imperial power. By the end of World War II, it was officially recognized as a superpower.

In the late nineteenth century, the United States abandoned its century-long commitment to isolationism and became an imperial power. Both a desire for new markets for its industrial products and a belief in the racial and cultural superiority of Americans motivated the United States' imperial mission.

One major reason the United States became an imperial power at this time was economic prosperity. The country had been developing its industrial sector quickly and successfully throughout the 19th century, which gave it the resources and capital necessary to extend its power overseas.

by User Pantera (4.5k points)