Yes, they did.
WWI was concluded by a treaty that demanded heavy reparations from the defeated countries, which burdened them financially and contributed to the outbreak of another war.
WWII, by contrast, was followed by efforts to integrate the former combatants more closely and strengthen them economically, in order to avoid further conflict.
This change of approach came about because the terms of the Treaty of Versailles were widely blamed for bringing about another war rather than a lasting peace.