How did Germany emerge from defeat in the First World War?

1 Answer

Answer:

Germany emerged from the First World War defeated and in political and economic turmoil. The economy was ruined, and the Kaiser had fled the country. The Weimar government, set up after the war, struggled to control the country and was deeply unpopular for accepting the Treaty of Versailles.

by User Nepho