The colonists that became the original United States were part of which European nation's land claims?

1 Answer

The original United States colonies were British colonies, established under the land claims and authority of Great Britain.
