Answer:
Manifest Destiny is the idea that the United States was destined—by God, its advocates believed—to expand its dominion and spread democracy and capitalism across the entire North American continent.
This belief encouraged many Americans to move westward, driving territorial growth and spreading their culture across the continent.