The colonies that became the original United States were part of which European nation's land claims?

1 Answer

The colonies that became the original United States were part of the land claims of England, that is, Great Britain.
by Mgadda

