Final answer:
The U.S. entered WWII after the attack on Pearl Harbor, which shifted the country from isolationism to active participation alongside the Allies. The war effort dramatically expanded American industrial production and brought significant social changes at home, including advances in civil rights.
Step-by-step explanation:
Overview of U.S. Involvement in World War II
The United States joined World War II after the attack on Pearl Harbor on December 7, 1941. This event jolted America out of its isolationist stance and launched it into full-scale involvement alongside the Allied powers. American industrial and military might made the country the 'Arsenal of Democracy,' supplying critical support to its allies. Victory in Europe (VE) Day, May 8, 1945, marked Nazi Germany's surrender, while Victory over Japan (VJ) Day marked the end of the war following the atomic bombings of Hiroshima and Nagasaki in August 1945. U.S. involvement in WWII also had significant social impacts: it fueled economic growth and helped lay the groundwork for the postwar civil rights movement, as women and minorities who had played crucial roles on the home front demanded equal rights after the war.