World War II was the single most important historical event in shaping the character of modern Germany.
Germany stood at the center of the war, and the experience transformed the country profoundly. The Germany we know today differs greatly from the Germany of the pre-war years, and the war was instrumental in changing the country for the better.