Answer:
It's true. The War of 1812 helped the United States gain respect from the nations of Europe.