What were some of the effects of the explorations of the west in the 1800s?

1 Answer

I think the "west" here refers to the American West.

First, exploration brought knowledge of the western territories of the US.

Then, it led to the displacement and disappearance of many Indigenous cultures in what is now the western US.

Finally, it led Europeans and Americans of European descent to settle in those areas.