Answer:
The World War 2 era marked the end of colonialism. If you look at what happened after WW2, you'll see that in both Africa and Asia, new countries were born from former French and British colonies.
This was a direct consequence of the USA becoming the Western superpower: it pressed its former allies to free their colonies. On one hand, that has to do with American history, that is, a republic born by fighting a king. But there is also an economic explanation: independent countries would likely buy more American goods than colonies would.
That worked well in some countries (Egypt, Iran, India) but not at all in others: Vietnam, Iraq, Syria, Uganda, etc.
Finally, all this was possible because the European colonial powers no longer had the leverage to negotiate any different arrangement with the USA.