Answer:
American imperialism stems from the nation's desire to expand its control and influence overseas, pursued through military, political, and economic power. United States imperialism emerged in the 19th century, dating back to the 1800s.