Many people believe the United States should have helped Germany after World War One. After the war, the harsh reparations imposed by the Treaty of Versailles contributed to a severe economic crisis in Germany. Many historians argue that the way World War One ended helped set the stage for World War Two.