The colonies that became the original United States were part of which European nation's land claims?

1 Answer

The colonies that became the original United States were part of the land claims of England, which later became part of Great Britain.
answered by Mgadda