Answer:
True. The War of 1812 helped the United States gain respect from the nations of Europe.