Answer:
A review of US imperialism is provided below.
Step-by-step explanation:
“American imperialism” is a phrase that refers to the economic, military, and cultural influence of the United States over other nations. First popularized during the administration of James K. Polk, the idea of an “American Empire” was made a reality during the latter half of the 1800s.