75.5k views
2 votes
Where did the British agree to recognize the United States as an independent nation after the American Revolution?

by Anesha (8.4k points)

2 Answers

2 votes
America won the American Revolutionary War, so the British agreed to recognize the United States as an independent nation. That recognition was formalized in the Treaty of Paris, signed in Paris in 1783.
by Renm (7.8k points)
3 votes
In Paris, at the signing of the Treaty of Paris in 1783 (not after the War of 1812).
by Fonfonx (8.5k points)
