8.9k views
1 vote
World War 2 caused the end of European imperialism. True or false?

2 Answers

4 votes

Answer:

False


by Alexey Savchuk (8.0k points)
3 votes
This is technically false. Although the end of World War II brought a complete rearrangement of Europe and far greater interdependence among European nations than before the war, several European states still held on to imperial ambitions. Britain and France, for example, retained large colonial empires after 1945, and most decolonization only came in the decades that followed, such as India's independence in 1947 and Algeria's in 1962.
by Kasean (8.1k points)