3 votes
In the 1800s, many Americans believed in manifest destiny. What does that term mean?

2 Answers

5 votes
Manifest destiny was the belief that it was God's will for Americans to expand westward and conquer the land to the west.
by User Erykah (6.9k points)
3 votes
It means it was considered obvious that God wanted the USA to expand and take over the whole of the North American continent.
by User Bulent (7.5k points)