After the War of 1812, Americans "gained a renewed sense of pride in their country." There were no territorial gains in the war; however, it demonstrated America's ability to hold off the British, and therefore increased national pride.