All of these countries had official colonies in Africa at the end of World War II EXCEPT

1 Answer

All of these countries had official colonies in Africa at the end of World War II except for Germany. Germany was stripped of all its African colonies (such as German East Africa and German South-West Africa) under the Treaty of Versailles after World War I, so it held no African territory by 1945.