How did the United States end up at war with Germany in 1941?

Asked by User Xpioneer

2 Answers


Answer:

After the U.S. declared war on Japan, Germany declared war on the U.S.

User Omkar Rajam

Germany declared war on the U.S. after Japan attacked Pearl Harbor, in solidarity with its Axis partner, Japan.
User Steve Peschka