What country brought the United States into World War 2?

Germany
Soviet Union (formerly known as Russia)
France
Japan

1 Answer


Final answer:

Japan brought the United States into World War 2.


Step-by-step explanation:

The country that brought the United States into World War 2 was Japan. Japan's surprise attack on the US naval base at Pearl Harbor on December 7, 1941, prompted the United States to declare war on Japan the following day. Germany and Italy, Japan's Axis allies, then declared war on the United States on December 11, 1941, bringing the US fully into the war in both the Pacific and Europe.



Answered by Spiralx (7.7k points)