What does the term “Southern Redemption” mean?

A. Southerners regained control of their states after Reconstruction.

B. Southerners were forced to pay the costs of the Civil War.

C. The South was freed from the institution of slavery.

D. The South became a fairer, more equal society.


2 Answers

Answer:

A. Southerners regained control of their states after Reconstruction.

Step-by-step explanation:

The era is called "Redemption" because white Southern Democrats, who styled themselves "Redeemers," regained control of Southern state governments as Reconstruction came to an end.


Answer: A: Southerners regained control of their states after Reconstruction.

Step-by-step explanation:

Historians use the term "Southern Redemption" for the period when Southern Democrats ousted the Republican governments established during Reconstruction, a process completed with the Compromise of 1877, which withdrew the remaining federal troops from the South.
