The War of 1812 helped gain respect for the United States from the nations of Europe. True or false?

2 Answers

Answer:

False


False; Europe still looked down on the United States.
