163k views
2 votes
URGENT: This is due in 4 minutes, please help fast!

Why do you feel that, after the War of 1812, the United States was more respected throughout Europe and the rest of the world? The US was seen as its own self-governing, established country without much foreign assistance; give two reasons to support this. If you don't feel that this is the case, give evidence that says otherwise.

asked by User Chrixian (4.2k points)

1 Answer

5 votes

Answer: The war had a far-reaching impact in the United States, as the Treaty of Ghent ended decades of bitter partisan infighting in government and ushered in the so-called "Era of Good Feelings." The war also marked the demise of the Federalist Party, which had been accused of being unpatriotic for its antiwar stance.


answered by User Jon Archway (4.5k points)