Which U.S. territory in the West became a state in 1850?
A. Maine
B. Ohio
C. California

1 Answer

Final answer:

California became a state in 1850, so the correct choice is C.


Step-by-step explanation:

The U.S. territory in the West that became a state in 1850 was California. After gold was discovered in 1848, the Gold Rush drew a huge wave of settlers, and California's population grew quickly enough to qualify for statehood. It was admitted to the Union as the 31st state on September 9, 1850, as part of the Compromise of 1850. The other options don't fit: Maine became a state in 1820 and Ohio in 1803, and neither is in the West.



answered by Basirat