What does the term "Southern Redemption" mean?

A. Southerners regained control of their states after Reconstruction.
B. Southerners were forced to pay the costs of the Civil War.
C. The South was freed from the institution of slavery.
D. The South became a fairer, more equal society.