Who owned Florida when America bought it?

asked by User TheDrot

1 Answer

Spain owned Florida when the United States acquired it, under the Adams-Onís Treaty of 1819. Great Britain had controlled Florida earlier (1763–1783) before returning it to Spain. Hope this helps you :)
answered by User David Herrero

