61.5k views
5 votes
How would you describe the United States after the War of 1812? Would you say the country was united as one or divided against itself after the war? Make sure to give reasons for your answer.

User Jia
by
5.5k points

2 Answers

13 votes
Honestly, I have no idea; I just need points.
User Aphex
by
4.9k points
6 votes
The Federalists and Republicans reversed political roles. Political leaders realized the importance of a strong military. The United States became more industrialized and took over vast amounts of Native American land. At home, Americans saw the war as a second war of independence. The war helped the United States establish economic independence, as cotton mills and other manufacturing industries were constructed throughout the nation. The United States also established itself as a world power.
User AbiusX
by
4.8k points