Final answer:
The War of 1812 left the United States deeply in debt but earned it greater international respect.
Step-by-step explanation:
The War of 1812 had several effects on the United States. One major effect was that the country fell into heavy debt: the cost of the war and the disruption of trade strained federal finances and contributed to a financial crisis. Another effect was that the United States gained international respect, since the war showed it was willing to defend itself and could stand up to a major world power like Britain. However, there is no evidence that Britain gave Florida to the United States as a result of the War of 1812; Florida was acquired from Spain through the Adams-Onís Treaty of 1819.