Before the attack on Pearl Harbor, Americans were divided about whether or not to enter the war. After World War I ended in 1918, many Americans became isolationists. Isolationism is the belief that the U.S. should stay out of disputes between foreign nations. Both Europe and Asia were far away from the U.S., and millions of Americans believed the country could "go it alone" and stay out of the war.
But Pearl Harbor changed things. The attack shocked Americans, and it united them. Overnight, the U.S. became a major player in World War II. Isolationism was no longer an option.