What first drew Americans out to the West?

2 Answers

The belief that American settlers were destined to expand westward is often referred to as Manifest Destiny.
answered by Fractalism (8.0k points)
They thought God was leading them to the West.
answered by Andy Donegan (8.3k points)

