2 votes
Should the United States have helped Germany recover from World War One?

1 Answer

3 votes
Many people believe the United States should have helped Germany after World War One. After WW1, the harsh reparations imposed by the Treaty of Versailles contributed to a severe economic collapse in Germany. Many historians argue that these punitive terms ending WW1 helped set the stage for World War 2.
answered by Ablemike (8.3k points)

