162k views
0 votes
How did the United States gain the territory of Florida?

asked by JeCh (8.6k points)

1 Answer

2 votes
In 1763, the Treaty of Paris signed by Great Britain, France, and Spain gave Britain the Florida Territory. When Britain formally recognized the colonies' independence as the United States in 1783, Florida was returned to Spain without a clear definition of its boundaries. The United States ultimately acquired Florida from Spain through the Adams-Onís Treaty, signed in 1819 and taking effect in 1821.
answered by Prajwol Onta (7.9k points)
