How did the Civil War change the landscape of the U.S. politically and socially?

Answer:

The end of the war reshaped the political landscape of the United States in several ways.

It shifted the balance of political power from the South to the North. It expanded the power and scope of the federal government. And it greatly enlarged the federal budget, transforming the government into the nation's largest employer.

— answered by Josh Buedel