The answer to this question is "Planters, who were able to expand plantations and Southern culture into Florida"
A whole lot more land became available once the U.S. acquired Florida from Spain. Planters saw this as a great opportunity to expand their plantations and to spread Southern culture throughout the new state of Florida.