Why did the United States take on colonies of its own?

(Please help and please be accurate! Thank you! :) )

1 Answer


Answer:

For generations of Americans, land symbolized opportunity. This began with colonists who had never owned property in Europe and who were captivated by the spacious continent.

by Ken Toh (4.5k points)