What did the Democrats believe in after the Civil War?

asked by Pudgeball

1 Answer


Final answer:

After the Civil War, the Democrats believed in states' rights, opposed Republican policies, and focused on agrarian interests.


Step-by-step explanation:

After the Civil War, the Democrats held several key positions. First, they supported states' rights and a limited role for the federal government. They also opposed Republican policies, including Reconstruction and the extension of civil rights to African Americans. Additionally, Democrats emphasized agrarian interests and economic recovery in the South.



answered by Nathan Kamenar