For many European countries, the end of WWI was the beginning of

asked by BharathBob (3.9k points)

1 Answer


Answer: The Treaty of Versailles of 1919, the peace treaty that officially ended the conflict, imposed punitive terms on Germany that destabilized Europe and laid the groundwork for World War II.


answered by Zin Win Htet (3.8k points)