Answer:
Manifest Destiny, a phrase coined in 1845, was the belief that the United States was destined (by God, according to its advocates) to expand its dominion westward and spread democracy and capitalism across the entire North American continent. In short, it was the conviction that Americans had both the right and the duty to settle the continent, conquer it, and prosper.