Which statement is true about British relationships after the French and Indian War?

Asked by MWillemse

2 Answers


The French and Indian War changed the relationship between the British and the colonists. Once the war ended, Britain passed a series of laws to deal with the changes the war had brought about. The colonists wanted to move west and settle the land Britain had gained from France, but the Proclamation of 1763 barred settlement west of the Appalachian Mountains, which strained relations.



Hope this helped!

Answered by Saleh Mahmood

Answer: They had strained relationships with both American Indians and colonists.

Step-by-step explanation: After the war, Britain was deeply in debt and began taxing the colonists to help pay for it, which angered them. At the same time, colonial pressure to settle the western lands, along with conflicts such as Pontiac's Rebellion, strained Britain's relationships with American Indians.

Answered by Jan Rieke