What did Americans realize after the War of 1812?


1 Answer


Answer:

Americans came away from the war with a sense of nationalism, and unity certainly seemed to be present. Yet although the country became unified in superficial ways after the War of 1812, for the most part the United States actually grew more divided.

