What did the Treaty of Paris do for the American colonies in 1783?

2 Answers

10 votes

Answer:

The Treaty of Paris was signed by U.S. and British representatives on September 3, 1783. It ended the American Revolutionary War and granted the U.S. significant western territory.

User Cuizizhe
by
4.1k points
4 votes

Answer:

The Treaty of Paris was signed by U.S. and British representatives on September 3, 1783, ending the American Revolutionary War. Based on a 1782 preliminary treaty, the agreement recognized U.S. independence and granted the U.S. significant western territory.

Hope this helps :)

User Expedito
by
4.6k points