Do you think the United States had a right to settle western North America? Explain.

asked by JustAPup

1 Answer


Answer:

No

Step-by-step explanation:

I would've said yes if they had come peacefully and shared the land with the Native Americans, but instead they mistreated the people already living there and spread disease among them. I think it was inhumane to wage war against the Natives on their own land and force them onto tiny reservations, where many of their descendants still live today. Many Natives died from this mistreatment. It was cruel and greedy.

answered by Zdd