The Civil War had a major impact on the United States politically, socially, and economically. Among its political consequences was a lasting shift in party power: after the war, the Republicans dominated national politics throughout Reconstruction.