105k views
3 votes
How did the United States change as it expanded westward?

1 Answer

11 votes

Answer:

The drive for westward expansion contributed to the annexation of Texas and the Mexican-American War in the mid-19th century. This expansion reopened the dispute over whether slavery would spread into the western territories, raising tensions between the North and South and eventually leading to secession and a bloody civil war.

answered by Jigal Van Hemert (4.2k points)