Why did the United States emphasize the war against Germany rather than Japan? Please be as thorough as possible.

asked by User Weivall (3.2k points)

1 Answer


Answer:

The United States followed a "Europe First" (or "Germany First") strategy because its leaders judged Nazi Germany to be the more dangerous enemy. (Note: it was Japan, not Germany, that attacked Pearl Harbor on December 7, 1941, and Japan was an Axis power allied with Germany, not on the US side.)

Step-by-step explanation:

1. Germany was the stronger threat. Germany had far greater industrial capacity and scientific resources than Japan, had conquered most of continental Europe, and was pressing both Britain and the Soviet Union hard.

2. America's key allies were at risk. If Britain or the USSR fell, the United States would have had to fight Germany virtually alone, so helping to defeat Germany quickly was the priority.

3. Germany forced the issue. Germany declared war on the United States on December 11, 1941, four days after the Japanese attack on Pearl Harbor, making the war in Europe unavoidable.

4. Japan could be contained. Allied planners believed Japan could be held back in the Pacific with a largely defensive effort while the main Allied strength was concentrated against Germany, and that once Germany was defeated, Japan could not hold out alone.

5. The policy was formalized with Britain. At the Arcadia Conference (December 1941 to January 1942), the United States and Britain agreed to make the defeat of Germany their first objective.

answered by User Nick Katsivelos (3.4k points)