When the Civil War ended, the Union took control of Florida’s government and …

1 Answer

Answer:

On June 25, 1868, after the state ratified the constitutional amendments abolishing slavery and granting citizenship to former slaves (the Thirteenth and Fourteenth Amendments), Florida was fully restored to the United States. This period after the Civil War is known as Reconstruction.

answered by Itsathere