What happened to German territory in the east after WWI?

asked by User Cereal

1 Answer


Germany lost World War I. In the 1919 Treaty of Versailles, the victorious powers (the United States, Great Britain, France, and other Allied states) imposed punitive territorial, military, and economic provisions on the defeated Germany. ... In the east, Poland received parts of West Prussia and Silesia from Germany.

answered by User Chris Simmons
