What do you think were some lasting effects of the American Civil War? Do you think the country is generally better off because of it?

1 Answer


Answer:

Some long-term effects of the Civil War were the abolition of slavery, the establishment of rights for formerly enslaved Black Americans, industrialization, and new innovations. The Northern states were not reliant on plantations and farms; instead, they relied on industry.

by Ilija Dimov