Where did the British agree to recognize the United States as an independent nation after the American Revolution?

asked Apr 17, 2017 in History (high-school) by Anesha (8.4k points) · 2 votes · 75.5k views
2 Answers
2 votes

America won the American Revolution, so the British agreed to recognize the United States as independent. They did so in the Treaty of Paris, signed in Paris in 1783.

answered Apr 18, 2017 by Renm (7.8k points)
3 votes

Some time after the War of 1812.

answered Apr 22, 2017 by Fonfonx (8.5k points)