What happened in the United States after the Europeans colonized (or took) the land from the Native Americans? That is, once they had taken the land from the Native Americans, the Europeans (now called Americans) held millions of acres. What happened to all of that land?

asked by Smarteist (3.7k points)

1 Answer


Answer:

Colonization disrupted many ecosystems, introducing new organisms while eliminating others. The Europeans also brought many diseases with them, which decimated Native American populations. Colonists and Native Americans alike looked to newly encountered plants as possible medicinal resources.

Hope this helped.

answered by JMabee (3.2k points)