Final answer:
Florida was controlled by Spain, France, and England before it became a territory of the United States.
Step-by-step explanation:
Spain claimed Florida in 1513 and governed it for most of the colonial period. France briefly held a foothold there in the 1560s with the settlement of Fort Caroline, before Spain drove the French out. Great Britain (England) controlled Florida from 1763 to 1783, after which it returned to Spanish rule. Spain finally ceded Florida to the United States under the Adams-Onís Treaty, and it was organized as a U.S. territory in 1822. So before becoming a U.S. territory, Florida was controlled by Spain, France, and England.