Answer:
It's true. The War of 1812 helped the United States gain respect from the nations of Europe.