3 votes
What did settlers bring to American Indian lands in the West?

by User Odaliz (3.1k points)

1 Answer

16 votes

Answer:

Although the discovery and settling of these new lands brought many positive things to Europe, such as new crops like the potato that saved European countries from famine at several points, the settlers also brought to America diseases to which the American Indians of both South and North America had no immunity.


by User Vitamin (3.2k points)