What major change took place around the world after World War II?


2 Answers

4 votes

Answer:

Former colonial nations became independent.

Step-by-step explanation:

The answer is B. The study material itself says that former colonial nations became independent after the war, which is how I confirmed my answer. If you get stuck, think back to whether the study material covered the topic.

2 votes

The correct answer is B. Former colonial nations became independent.

After World War II, the nations under the rule of the colonial powers began, one by one, to gain independence. This happened for several reasons. The colonial powers emerged from the war economically and militarily exhausted; they had to focus on their own rebuilding and development and could no longer afford to invest in and control their colonies. International pressure was also mounting, and numerous independence movements formed, reflecting widespread revolt and a strong desire for independence among the people of these nations. Weighing all of this, the colonial powers in many cases granted these nations independence without any military conflict.
