Why did the government need to sell lands in the West?

1 Answer

Answer:

The government sold and granted western lands to encourage settlement, expand the country, and fulfill Manifest Destiny.

Step-by-step explanation:

In U.S. history, the government sold, and in some cases gave, land in the West to anyone who would live on it and work it, so the country could expand and fulfill Manifest Destiny. Manifest Destiny was the belief that God intended American settlers to spread across the North American continent, carrying democracy and capitalism with them.

Hope this helps!
