How did the concept of Manifest Destiny encourage Americans to move west?

1 Answer

Answer: Manifest Destiny, a phrase coined in 1845, is the idea that the United States was destined—by God, its advocates believed—to expand its dominion and spread democracy and capitalism across the entire North American continent. This belief encouraged Americans to move west by framing westward settlement as both inevitable and righteous.
