Because at first many Native American tribes fought against the Americans by siding with Britain and France. But when the colonies won their independence and became the United States, the Indians were betrayed. They tried to make peace with the new nation, but eventually even the tribes that had treaties were forced far from their native lands, with the promise that the territory they were moved to would be theirs forever. In short, the United States government broke nearly every promise it made to the Native Americans, and sadly, genocide was a big part of it as well.