Some westward expansion by the United States was inevitable, especially after

asked by Wason

1 Answer

Some westward expansion by the United States was inevitable, especially after the Louisiana Purchase and after people embraced the idea of Manifest Destiny, which held that the US was "destined" to expand from the Atlantic to the Pacific.
answered by Rich Maes