What is Manifest Destiny? Why do Americans feel they have the right to practice it?

asked by Jerard

1 Answer

Manifest Destiny is the 19th-century belief that the U.S. had a God-given right to expand its territory across the North American continent. In a nutshell, that's all you need to know.
answered by Jason Bert

