How did the Treaty of Paris change the United States?

1 Answer


Answer:

The Treaty of Paris was signed by U.S. and British representatives on September 3, 1783, ending the American Revolutionary War. Based on preliminary articles agreed in 1782, the treaty recognized U.S. independence and granted the new nation significant western territory, extending its boundary west to the Mississippi River.

