Answer:
American imperialism
Step-by-step explanation:
Imperialism is the policy by which one country exercises control over another, usually through military force and sometimes through diplomacy. American imperialism, then, refers to the United States' policy of controlling foreign countries. Several factors explain American imperialism, including perceived threats to the economic interests of U.S. private corporations abroad, which were often addressed through unequal treaties or military conquest.