How did the Civil War change the landscape of the U.S. politically and socially?

[full answer and explanation needed]

asked by Brad Parks (2.6k points)

1 Answer

15 votes

ANSWER:

How did the end of the war change the political landscape of the United States?

It shifted the political balance of power from the South to the North. It expanded the power and scope of the federal government, greatly enlarged the federal budget, and transformed the government into the nation's largest employer.

answered by Gauss (2.2k points)