In 1800, what region was west of the United States?

1 Answer

Answer:

As American settlement expanded westward, the meaning of the term "the West" changed. Before about 1800, the crest of the Appalachian Mountains was seen as the western frontier. The frontier then moved westward, and eventually the lands west of the Mississippi River came to be considered the West.

Answered by Sachin Trivedi