Discuss how you think life in the South was different after the war.

1 Answer


When the Civil War ended, the Union was restored and slavery came to an end. The war also put an end to the plantation class that had ruled Southern society. Many farms and buildings were destroyed when the North invaded the South, and without enslaved labor there was no one to work the fields or maintain the plantations. Many Southerners were left in poverty, and it would take time for them to recover what was lost in the war.
