112k views
2 votes
What first drew Americans out to the West?

2 Answers

0 votes
The belief that American settlers were destined to expand westward across the continent is often referred to as Manifest Destiny.
by Fractalism (4.8k points)
3 votes
They believed that God was leading them to the West.
by Andy Donegan (5.0k points)