2 votes
What do you think the West came to symbolize in American culture?

1 Answer

3 votes
The West came to symbolize the growth of America's population as the country expanded.
by Tawan (8.4k points)
