(I want your opinion.)

What do you think Germany teaches their kids about World War 1 and World War 2?


Do you think they teach their kids that Germany was good and should have won?

or

Do you think they teach their kids that Germany was wrong, that it shouldn't have done those things, and that it deserves the blame for all of it?

by User Fawn

1 Answer

I think they do a bit of both: they want to make the country look good, but they also want to show kids that it was a very wrong and cruel thing to do.
by User Jbgt