How did the War Between the States redefine America?


1 Answer

Answer: The war redefined many aspects of American life. It led to the abolition of slavery, though not to true freedom for all former slaves and their descendants. It also increased the power of the federal government relative to the states, a shift that later wars continued step by step.

