3 votes
What have we learned that contributes to your understanding of America as an imperialistic nation?

1 Answer

4 votes

Answer:

“American imperialism” is a term that refers to the economic, military, and cultural influence of the United States on other countries. ... During this time, industrialization caused American businessmen to seek new international markets in which to sell their goods.


Answered by Render (5.1k points)