6 votes
Did the end of World War I lead to World War II?

2 Answers

7 votes

Answer: Germany signed an armistice on November 11, 1918, and all nations agreed to stop fighting while the terms of peace were negotiated. On June 28, 1919, Germany and the Allied Powers (including Britain, France, Italy and the United States) signed the Treaty of Versailles, formally ending the war. This is how World War I ended.

User Bhavesh, 3.8k points
3 votes

Answer:

hope this helps :)

Explanation:

WWI ended with Germany signing the Treaty of Versailles. Germany was forced to sign the treaty; refusing would have meant renewed attack, so there was essentially no room to compromise. ... Germany's entry into the war had drawn several other countries in, turning it into a full world war.

User GoRGon, 3.3k points