How did American politics change after Reconstruction?

1 Answer

The Reconstruction era redefined U.S. citizenship and expanded the franchise, changed the relationship between the federal government and state governments, and highlighted the differences between political and economic democracy. The Fourteenth Amendment (1868) established birthright citizenship and equal protection under the law, and the Fifteenth Amendment (1870) barred states from denying the vote on the basis of race.
answered by MattUebel (7.8k points)
