What eventually brought the United States into World War II?


1 Answer


Answer: The United States entered World War II as a direct result of Japan's surprise attack on the U.S. naval base at Pearl Harbor, Hawaii, on December 7, 1941. The attack killed more than 2,400 Americans and damaged or destroyed much of the Pacific Fleet. The following day, President Franklin D. Roosevelt delivered his famous "Day of Infamy" speech to Congress, which declared war on Japan that same day. Japan's Axis allies, Germany and Italy, then declared war on the United States, bringing the country fully into World War II.

