115k views
2 votes
What officially brought the US into WWII, and when did it happen?

asked by User Aristotll (4.7k points)

1 Answer

4 votes

Answer:

The attack on Pearl Harbor brought the U.S. into WWII. On December 7, 1941, the Japanese air force bombed the American fleet at Pearl Harbor, Hawaii, and the United States formally declared war on Japan the following day, December 8, 1941.


answered by User Mythz (4.6k points)