8 votes
PLEASE HELP, I'M FAILING!!

Following the War of 1812, the United States gained the respect of European nations.



True
False

asked by Craig Graham (2.8k points)

2 Answers

15 votes
Answer: True

Explanation: I've taken the test before!
answered by Slugart (2.9k points)
8 votes

Answer:

It's true. The War of 1812 helped the United States gain respect from the nations of Europe.

answered by Zooes (3.1k points)