Answer:
Yes, this is true. The War of 1812 helped the United States gain respect from the nations of Europe.