Following its defeat in World War II, Japan was occupied by Allied forces led by the United States, bringing about significant changes. Japan was disarmed, its empire was dissolved, its government was transformed into a democracy, and its economy and educational system were restructured and rebuilt.