9 votes
PLEASE HELP, I'M FAILING!!

Following the War of 1812, the United States gained the respect of European nations.

True
False

2 Answers

5 votes
Answer:

True

Explanation: I've taken the test before!
by Andrew Cumming (6.6k points)
3 votes

Answer:

It's true. The War of 1812 helped the United States gain respect from the nations of Europe.

by David M Smith (5.6k points)