What is Manifest Destiny? Why do Americans feel they have the right to practice it?

asked by Jerard

1 Answer

Manifest Destiny is the 19th-century belief that the U.S. had a God-given right to expand its territory across the North American continent. That's all you need to know in a nutshell, to be honest.
answered by Jason Bert