I think the "west" here refers to the West of the US.
First, exploration brought some knowledge of the west of the US.
Then, it led to the disappearance of Indigenous cultures in the west of today's US.
Finally, it led Europeans and Americans of European descent to settle in those areas.