Final answer:
Japan brought the United States into World War II.
Step-by-step explanation:
Japan's surprise attack on the US naval base at Pearl Harbor, Hawaii, on December 7, 1941, prompted the United States to declare war on Japan the following day. Germany and Italy, Japan's Axis allies, then declared war on the United States on December 11, 1941, drawing the country fully into the global conflict.