10 votes
After the United States declared war on Japan, what nation declared war on the United States?

by Kirill Chatrov (2.6k points)

2 Answers

20 votes

Final answer:

Germany and Italy declared war on the United States on December 11, 1941, following the U.S. declaration of war on Japan, officially bringing the U.S. into World War II.

Step-by-step explanation:

After the United States declared war on Japan following the attack on Pearl Harbor on December 7, 1941, it was Germany and Italy that declared war on the United States. This occurred on December 11, 1941, as a consequence of the Tripartite Pact, which bound Japan, Germany, and Italy in a defensive alliance. Germany's and Italy's declarations of war meant that the United States was now officially engaged in World War II, fighting the Axis powers on multiple fronts.

by Knowingpark (2.8k points)
23 votes

Answer: Germany and Italy

Explanation: Germany and Italy declared war on the United States, and that is how the United States entered World War II.

by Jordan Ryan Moore (2.6k points)