World War II caused the end of European imperialism. True or False?

2 Answers

I believe the answer is false: World War II did not cause the end of European imperialism.
by User Kashon (8.0k points)
This is technically false. Although the end of World War II brought with it a complete rearrangement of Europe and far more interdependence between European nations than before the war, several European states still clung to imperial ambitions. France, for example, fought to retain Indochina and Algeria, and Britain held much of its empire into the 1960s, so decolonization unfolded gradually over the following decades rather than ending with the war.
by User Gonzalez (7.6k points)