4 votes
From the following selection of U.S. states, select those that are Western states: Alaska, Arizona, Hawaii, Iowa, Kentucky, Montana, Wyoming.

2 Answers

6 votes
Alaska, Arizona, Hawaii, Montana, and Wyoming
by User Mouffette (8.3k points)
3 votes

Answer:

Arizona, Montana, and Wyoming

Step-by-step explanation:

The American West is commonly defined as the region of the United States west of the Rocky Mountains. Its boundaries have shifted over time as the nation expanded toward the Pacific; the U.S. Census Bureau's West region also counts Alaska and Hawaii.

Within the contiguous United States, the Western states are:

  • Arizona
  • California
  • Colorado
  • Idaho
  • Montana
  • New Mexico
  • Nevada
  • Oregon
  • Utah
  • Washington
  • Wyoming
by User Steve Buzonas (7.8k points)