2 votes
Why do Americans have the right to take more land in North America?

asked by User JMorgan (5.5k points)

1 Answer

5 votes

Explanation: They claimed the right because their founding fathers discovered the land and declared it as theirs, so they considered it theirs to keep and to do with as they wished.

answered by User Ritesh Bhavsar (6.0k points)