How did the definition of west change in the years 1800 to 1860?

1 Answer

The "West" began as any land between the Appalachians and the Mississippi River. After the Louisiana Purchase in 1803, the West was now any of the new territory beyond the Mississippi and north of Spanish territory in the south (Mexico, Texas, etc). By the time of the Civil War, the United States' territory extended to the Pacific and most of the landmass was what we now recognize as the continental USA.