When the Civil War ended, the Union was restored and slavery came to an end. The war also brought an end to the plantation class that had ruled Southern society. Many farms and buildings were destroyed when the North invaded the South. Without enslaved labor, no one was there to work the planters' fields or take care of their homes. Many were now left in poverty, and it would take time for them to recover what was lost in the war.