What officially brought the US into WWII, and when did it happen?

asked by Aristotll

1 Answer


Answer:

The attack on Pearl Harbor brought the US into WWII. It occurred when the Japanese air force bombed the American fleet at Pearl Harbor, Hawaii, on December 7, 1941; the United States formally declared war on Japan the following day, December 8, 1941.

answered by Mythz
