3 votes
After the United States declared war on Japan, what nation declared war on the United States?

by User Pozuelog (5.5k points)

2 Answers

2 votes

Final answer:

Germany and Italy declared war on the United States on December 11, 1941, following the U.S. declaration of war on Japan, officially bringing the U.S. into World War II.

Step-by-step explanation:

After the United States declared war on Japan, following the attack on Pearl Harbor on December 7, 1941, it was Germany and Italy that declared war on the United States. This occurred on December 11, 1941. Germany and Italy were allied with Japan through the Tripartite Pact; although the pact's terms were defensive and did not strictly obligate them to act when Japan was the attacker, both chose to declare war in support of Japan. Their declarations meant that the United States was now fully engaged in World War II, fighting the Axis powers on multiple fronts.

by User JStead (5.5k points)
1 vote

Answer: Germany and Italy

Explanation: Germany and Italy declared war on the United States on December 11, 1941. That is how the United States entered World War 2; the war itself had begun in Europe in 1939.

by User Lejiend (6.0k points)