The War of 1812 helped the United States by raising its level of patriotism and nationalism. Although the U.S. did not win the war outright, it did not lose either, and holding its own against Britain, the country that had once ruled over it, was a significant achievement for the young nation.