What happened in Japan during the Cold War?

asked by Kevin Peno

1 Answer

Allied forces led by the United States occupied Japan after its surrender at the end of World War II, bringing about significant changes that shaped the country throughout the Cold War. Japan was disarmed, its empire was dissolved, its government was transformed into a parliamentary democracy, and its economy and educational system were restructured and rebuilt.
answered by Jortizromo