American imperialism is the term for a policy aimed at extending the political, economic, and cultural control of the United States government over areas beyond its boundaries. Depending on the commentator, it may include military conquest, gunboat diplomacy, unequal treaties, subsidization of preferred factions, economic penetration by private companies followed by intervention when those companies' interests are threatened, or regime change.