How did the United States change the Japanese government after World War II?

Answer:

After Japan surrendered in 1945, ending World War II, Allied forces led by the United States occupied the nation and brought drastic changes. Japan was disarmed, its empire was dissolved, its government was transformed into a democracy, and its economy and education system were reorganized and rebuilt. Under the occupation, a new constitution took effect in 1947, reducing the emperor to a symbolic role and renouncing war.
