Final answer:
Spain agreed to cede Florida to the United States largely because it was occupied with fighting independence movements in its Latin American colonies.
Step-by-step explanation:
The main reason Spain agreed to give up Florida was that it was busy fighting rebellions in Latin America. Spanish colonies there, including Mexico, Venezuela, and Colombia, were seeking independence, which drew heavily on Spain's military and financial resources. With its forces stretched thin, Spain found it difficult to defend and govern a distant territory like Florida. Under the Adams-Onís Treaty of 1819, Spain ceded Florida to the United States, which in turn agreed to assume up to $5 million in claims that American citizens held against Spain. The cession allowed Spain to shed a burden it could no longer sustain.