144k views
5 votes
How did the US claim Florida? Who owned Florida in the first place? Why did they give up the land to the US?

2 Answers

7 votes

Answer:

The U.S. acquired Florida through the Adams-Onís Treaty of 1819, also known as the Florida Purchase Treaty. Spain owned Florida first; Britain held it from 1763 to 1783 before returning it to Spain. By the early 1800s, Spain could no longer afford to defend or govern the territory, so Florida had become a burden, and Spain ceded it to the U.S.

Step-by-step explanation:

I hope this helps.

User Adrian Klaver
by 4.9k points