What do you think the West came to symbolize in American culture?

asked by Fogus

1 Answer

The West came to symbolize many things in American culture. For example, it brought modern infrastructure along the coast, which in turn generated a large profit for America.
answered by Danieltahara