12 votes
For many European countries, the end of WWI was the beginning of

asked by Atiar Talukdar (2.8k points)

1 Answer

25 votes

Answer: The peace treaty that officially ended the conflict, the Treaty of Versailles of 1919, imposed punitive terms on Germany that destabilized Europe and laid the groundwork for World War II.


answered by Tal Haham (3.0k points)