17.9k views
2 votes
What do you think the West came to symbolize in American culture?

1 Answer

3 votes
The West came to symbolize the growing population and expansion of America.
by User Tawan
8.3k points
