What do you think the West came to symbolize in American culture?

asked by JRulle

2 Answers

The West came to symbolize the growth of America. The discovery of gold there brought new wealth and spurred industrialization, and the push westward embodied the American spirit of always moving onward.
answered by KendallB
The West was sparsely settled by Americans at the time, and much of its culture was inherited from the Native Americans who already lived there. Moving beyond British and other European influence was a major shift. The West came to represent greater freedom and the possibility of extending the country.
answered by Evan Stoddard