Which of the following was true about Germany after World War I?

Asked by Alfy

1 Answer

After World War I, the Treaty of Versailles forced Germany to accept responsibility for the war (the "war guilt" clause, Article 231) and stripped it of its overseas colonies and a significant portion of its territory, including Alsace-Lorraine, which was returned to France. Germany also had to pay heavy reparations to the Allied Powers, and its military was sharply restricted: the army was capped at 100,000 men and the country was barred from maintaining an air force. These terms were widely seen in Germany as a humiliation, which bred deep resentment and bitterness toward the Allied Powers.
Answered by Britany