What event brought the United States into World War II? A. An attack on Pearl Harbor

asked by Asia (4.7k points)

2 Answers

2 votes

Answer:

The USA didn't want anything to do with World War II until Japan attacked Pearl Harbor on December 7, 1941, which brought the USA into World War II!


answered by Tkrishtop (4.7k points)
4 votes

Answer:

The attack on Pearl Harbor.

answered by Iya (4.6k points)