72.7k views
5 votes
What does the term manifest destiny imply?

asked by User Mistwalker (7.1k points)

2 Answers

0 votes
Manifest destiny was the belief or doctrine, held chiefly in the middle and latter part of the 19th century, that it was the destiny of the U.S. to expand its territory over the whole of North America.
answered by User Tommaso Barbugli (7.8k points)
6 votes
Manifest Destiny was the idea that the people of America had a mission from God to conquer the continent from East to West. In reality, it was a term used to rationalize the American people's hunger for land and westward expansion. It was also used to justify the 'westernization' of the indigenous peoples they encountered as they moved west.
answered by User Ali Helmy (7.8k points)