The colonies formed the United States of America after breaking away from Great Britain.

True
False

1 Answer

I'm pretty sure that's false. The first colonies were founded in the 1600s, but the USA didn't become independent until 1776, so the colonies existed long before the break from Great Britain rather than being formed after it.
by Fabienne (7.3k points)

