Answer:
Westward expansion profoundly changed American society. As the nation grew, more people looked west for cheaper land. ... This benefited the West by making it easier to move goods and people; it also benefited Americans as a whole, as American manufacturing expanded and new jobs were created.