Final answer:
Great Britain took control of Florida from Spain at the end of the French and Indian War through the Treaty of Paris.
Step-by-step explanation:
At the end of the French and Indian War, Great Britain took control of Florida from Spain. Under British rule, Florida was divided into two separate colonies: East Florida and West Florida. East Florida included the peninsula that makes up most of the state today, while West Florida encompassed the panhandle region. The British presence in Florida led to the establishment of new settlements and economic growth, particularly in agriculture and trade.
The Treaty of Paris, signed in 1763, marked the end of the war. Under its terms, Spain ceded Florida to Great Britain in exchange for the return of Havana, which the British had captured during the war, while France surrendered its North American territory east of the Mississippi River to Britain. This outcome made Great Britain the dominant power in eastern North America and solidified its control over the region.