Between WW1 and WW2, which war do you think the United States had more to gain by winning?

1 Answer


Answer:

WW2

Step-by-step explanation:

In WW1, the US was largely isolationist, though it did support the British and French with some arms and troops. The US joined the war very late, entering only in its final year, so America had no real ambitions of bringing anything under its own control. It did help create the League of Nations, which accomplished almost nothing, ironically in part because America never joined it.

In WW2, as I'm sure you're aware, Germany managed to conquer most of western and eastern Europe, essentially the whole continent, though the UK never fell under German control, and the war proved enormously costly for the British. The US was a sleeping giant preoccupied with the Great Depression, but it's safe to say the giant woke up when Japan attacked Pearl Harbor.

The US, alongside the UK, the Soviets, and some French troops, was able to push the Germans back toward Berlin. I'm also sure you know about the Cold War; by that point, Western Europe's only real defense was the US, which meant that almost all the Western European nations had to rely on America for protection. The US was also able to occupy all of Japan, so that was another big win.
