The "West" began as any land between the Appalachians and the Mississippi River. After the Louisiana Purchase in 1803, the West was now any of the new territory beyond the Mississippi and north of Spanish territory in the south (Mexico, Texas, etc). By the time of the Civil War, the United States' territory extended to the Pacific and most of the landmass was what we now recognize as the continental USA.