The colonies formed the United States of America after breaking away from Great Britain.

True
False

1 Answer

I'm pretty sure that is false. The first colonies were established in the 1600s, but the USA did not become independent until 1776.
by User Fabienne (3.3k points)