2 votes
True or False: World War II caused the end of European imperialism.

by Ji Mun (8.0k points)

2 Answers

5 votes

Answer:

False.

Step-by-step explanation:

Literally Google the whole question; the answer is false.

by Shusen Yi (7.5k points)
6 votes

Answer:

True

Step-by-step explanation:

While World War I brought an end to some empires in Europe, not all of them disappeared; several colonial empires were still intact at the start of World War II. After the war ended, the colonial powers declined and much of Europe became economically interdependent. World War II did not end European imperialism completely, but it caused most of the colonial empires to disappear.

by Katinka (9.1k points)