0 votes
What did the US do with Japan after WW2?

asked by DeoKasuhal (6.3k points)

1 Answer

4 votes

Answer: After the defeat of Japan in World War II, the United States led the Allies in the occupation and rehabilitation of the Japanese state.

Explanation: The U.S.-led occupation lasted from 1945 to 1952 under General Douglas MacArthur, the Supreme Commander for the Allied Powers. During that time Japan was demilitarized, democratic reforms were carried out, and a new constitution took effect in 1947. Hope this helps!

answered by Ryan Searle (6.3k points)