Answer:
Spain and France established colonies in the Americas for many of the same reasons England did, chiefly wealth, power, and prestige. Both countries also planted colonies before England did: Spain began colonizing in the 1490s, and France had settlements such as Port Royal (1605) before England's first permanent colony at Jamestown (1607). France's North American colonies, like Quebec, existed mainly to support its fur trade with Native Americans.
I hope this helps :)