1. When the United States first became a nation, what did the West mean?

asked by User Hoang (4.8k points)

1 Answer


Answer: The West was thought of as a land of peace and prosperity.

Explanation: At first the West was largely unsettled, and there was no established government there.

answered by User Wingblade (5.0k points)