The belief, shared by many Americans, that the United States had a right to own all the land from the Atlantic to the Pacific was called what?


1 Answer

It was called Manifest Destiny, from what I remember. Hope that helps!
