What is one way that Japan changed following World War II?

asked by MikaelF

1 Answer


Japan was defeated in World War II, after which the United States led the Allies in the occupation and rehabilitation of the Japanese state. Between 1945 and 1952, the U.S. occupying forces enacted widespread military, political, economic, and social reforms.

answered by Genjix