24 votes
How did the US change the government of Japan after World War II?

by Sukhjinder Singh (2.5k points)

2 Answers

6 votes

Answer:

After Japan surrendered in 1945, ending World War II, Allied forces led by the United States occupied the nation, bringing drastic changes. Japan was disarmed, its empire dissolved, its form of government changed to a democracy, and its economy and education system reorganized and rebuilt.

by Sam Yates (2.8k points)
11 votes

The US created a democratic government in Japan.

by Ward Segers (3.1k points)