Which important document announced that the American colonies no longer wished to be part of the British Empire?

2 Answers


Answer:

D


Answered by Keleigh
The Declaration of Independence announced that the American colonies no longer wished to be part of the British Empire.
Answered by Gin
