What role did "Manifest Destiny" play in getting people to move west?

1 Answer

Answer:

Manifest Destiny was a popular belief in the mid-to-late 19th century. Its proponents claimed that the United States had a divine right to expand westward, meaning that U.S. expansion was the will of God. By framing settlement of the West as both justified and inevitable, this belief encouraged many Americans to migrate westward.
