22 votes
The colonies formed the United States of America after breaking away from Great Britain.

True
False

by JHowIX (3.0k points)

1 Answer

16 votes
I’m pretty sure that is false. The first colonies were established in the 1600s, but the USA didn’t become independent until 1776.
by N Chauhan (2.6k points)