Select the best definition for Imperialism below

Imperialism was a historical time period when European powers were colonized by weaker nations


Imperialism was a historical time period when Europe, the United States, and Japan took over weaker nations for their benefit.


Imperialism was right after the Great Depression.


Imperialism was only about civilizing non-Christians around the world.

1 Answer

The correct answer is: Imperialism was a historical time period when Europe, the United States, and Japan took over weaker nations for their benefit.

Europeans acted as imperialists when colonizing places like South America and Africa, while Japan was imperial in Asia, in places like China and Korea. The United States didn't fight on other continents but instead expanded westward toward the Pacific Ocean, taking out anyone who stood in its way.
by Christian Findlay (7.5k points)
