The Treaty of Versailles required Germany to pay reparations for the costs of the war and sharply restricted its military, capping the army at 100,000 men. The reparations burden contributed to the hyperinflation crisis of 1923, and the Great Depression that began in 1929 drove the economy into collapse and mass unemployment. A sustained recovery did not begin until after Adolf Hitler came to power in 1933.