The War of 1812 helped the United States gain respect from the nations of Europe. True or false?

— User OnCreate (8.7k points)
2 Answers

Answer:

False.

Explanation:

The War of 1812 did not win the United States respect from the European powers, which continued to regard the young nation with hostility.

— User Naderabdalghani (8.0k points)

False. Europe still viewed the United States with hostility.

— User Paul Farnell (7.5k points)
