173k views
16 votes
PLEASE HELP, I'LL GIVE BRAINLIEST!

Do you think it was America's divine right to expand westward?

by SimUser (3.5k points)

2 Answers

3 votes

Answer:

Manifest Destiny: the political doctrine or belief, held by the United States particularly during its westward expansion, that the nation had a God-given right to expand to the west.

And yes, it was obvious and inevitable, and it was a divine right of the United States.


by Sukhi (3.6k points)
3 votes

Answer:

I don't think it was divine.

Step-by-step explanation:

First, I don't personally believe in that stuff; I'm an atheist. But think about how many people want to explore new places. It was probably the same back then: they were curious about what was out there and started to explore. That's what I think.

Hope this helps

by Shital (3.3k points)