Answer:
Yes. Up until the Southern states seceded and formed the Confederacy, the Civil War was not inevitable. ... Once secession occurred, the Union maintained that the Southern states were not free to leave, because remaining in the Union was part of what they had agreed to when they joined the United States of America.
Step-by-step explanation: