Answer:
Native Americans were pushed continually westward as white settlers pursued more land for crops and as the American population as a whole expanded. Natives also contracted deadly diseases from whites, to which they had little immunity. In addition, many Americans tried to force what they perceived as "civilization" (Christianity, the English language, capitalism, agriculture, formal education, a diminished role for women, American-style dress, etc.) upon Natives.