4 votes
What happened after the French colonization?

asked by BinaryGuy (6.7k points)

1 Answer

4 votes

Answer: France's North American possessions were lost to Britain and Spain, but the latter returned Louisiana (part of New France) to France in 1800. The territory was then sold to the United States in 1803 in the Louisiana Purchase. France built a new empire mostly after 1850, concentrated chiefly in Africa, as well as Indochina and the South Pacific.


answered by Muntasir Aonik (7.1k points)