How did the United States impact Japan after WW2 ended?

1 Answer

After the defeat of Japan in World War II, the United States led the Allies in the occupation and rehabilitation of the Japanese state. During the occupation (1945–1952), headed by General Douglas MacArthur as Supreme Commander for the Allied Powers, Japan was demilitarized, adopted a new constitution, and underwent democratic and economic reforms.
by Nikola (7.5k points)
