True or false: Unlike some other countries, the United States does not teach patriotism in the schools.

Asked by Howderek

1 Answer


The answer is true. Although schools in the United States teach students how to take part in a self-governing democracy, they do not teach patriotism as a subject. Students are expected to develop an appreciation for their country on their own, without schools explicitly discussing or promoting patriotism.

Answered by Mohax