According to the Declaration of Independence, who took away the rights of Americans?

asked by User Danziger

1 Answer

According to the Declaration of Independence, it was Great Britain, under King George III, that took away the colonists' rights; the Colonies did not win their freedom until the Revolutionary War.
answered by User Kabilesh
