110k views
3 votes
The colonies that became the original United States were part of which European nation's land claims?

1 Answer

5 votes
England. The land was under English control, but the land itself was often owned by individuals to whom the king owed money, by proprietors (businessmen), and by the king himself.
by Tilleryj
6.2k points