After WWI, African countries _____________.

A. had revolutions and formed democratic governments
B. were handed over to France and England
C. were given to Germany to compensate them for the war
D. became independent

2 Answers

Answer:

D. became independent
by Melih Sevim

Answer:

B. were handed over to France and England.

Step-by-step explanation:

After WWI, African countries generally remained colonies; most achieved independence only after WWII. However, territories previously controlled by Germany (and, in the Middle East, by the Ottoman Empire) were handed over to England and France as League of Nations mandates following the defeat of the Central Powers.

by Ntshembo Hlongwane