What do you think the West came to symbolize in American culture?

asked by User Fogus (7.9k points)

1 Answer

The West came to symbolize many things, such as modern infrastructure along the coast, which brought great profit to America.
answered by User Danieltahara (8.9k points)

