Final answer:
Florida was controlled by Spain, France, and Great Britain before becoming a territory of the United States.
Step-by-step explanation:
Florida was first claimed and settled by Spain in the early 1500s. France briefly held a foothold there as well, founding Fort Caroline near present-day Jacksonville in 1564 before Spanish forces destroyed it the following year. Florida remained under Spanish control until Spain ceded it to Great Britain at the end of the Seven Years' War in 1763. Spain regained Florida after the American Revolution, and in 1819 ceded it to the United States through the Adams-Onís Treaty, after which Florida became a U.S. territory.