Answer:
C & D
Step-by-step explanation:
Imperialism is defined as the policy, practice, or advocacy of extending the power and dominion of a nation, especially by direct territorial acquisitions or by gaining indirect control over the political or economic life of other areas (Merriam-Webster). In plain English, this means that a country seeks to extend its influence over surrounding countries through economic, military, or political means. It also pursues more colonies -- a drive that led many European countries into the Scramble for Africa.