Answer: The main goal of French colonization in West Africa was to turn the West African colonies into an extension of the French state. This meant changing the people's way of life: French became the official language, and the colonized were pushed to convert to a new religion, Christianity. In this way, French colonization reshaped West African culture.
Explanation: Hope this helped :)