What brought the United States into World War II? A. An attack on

asked by Asia
2 Answers


Answer:

The USA didn't want anything to do with World War II until Japan attacked Pearl Harbor on December 7, 1941, which brought the USA into the war.


answered by Tkrishtop

Answer:

The attack on Pearl Harbor.

answered by Iya

