Answer:
“American imperialism” refers to the economic, military, and cultural influence of the United States over other countries. ... During this period, in the late nineteenth century, industrialization led American businessmen to seek new international markets in which to sell their goods.