Correct answer: The end of World War II.
European nations had established colonial empires in Africa in the late 19th and early 20th centuries. Coming out of World War II, those European nations lacked the resources and energy to maintain far-away empires in Africa and Asia, as well as the ability to suppress revolutions there. They were also now aligning themselves with either the USA or the USSR in the new Cold War world that was taking shape.
Additional factors: Coming out of the period of the World Wars, there was a prevailing attitude among the Allies that peoples should be allowed to determine their own nations and governments. In other words, attitudes favoring colonialism and imperialism were giving way to the belief that all nations should be able to determine their own destinies. Coupled with that were rising nationalist movements in African nations that had been dominated by European imperialism. Those factors, along with the economic realities in Europe, started the trend toward decolonization.