What happened after the United States was attacked by Japan?

1 Answer

possible answer: The attack's most significant consequence was the entry of the United States into World War II. The US had previously been officially neutral, but after the attack it entered the Pacific War, the Battle of the Atlantic, and the European theatre of war.

explanation: unsure of context
answered by Carene (5.1k points)