Which event finally brought the United States into World War II?

a. Japan's attack on Pearl Harbor
b. Germany's invasion of France
c. Britain's attack on Gibraltar
d. Italy's invasion of Greece

1 Answer

It was "a. Japan's attack on Pearl Harbor" that finally brought the United States into World War II, since this was a direct attack on a United States military establishment, which was an indisputable act of war. 