Alabama became a state of the United States in 1819. This was a period during which the Indian wars ended and most Native Americans were forced out of the state. White settlers arrived in large numbers, bringing with them African slaves who were sold and traded in the domestic market.