Answer:
It's true. The War of 1812 helped the United States gain respect from the nations of Europe.