22.5k views
0 votes
How did the United States role change in the early 1800s?

by User Reckless (7.5k points)

1 Answer

7 votes
During the 1800s, the United States gained a great deal of land in the West and began to industrialize. In 1861, several Southern states left the United States to form a new country called the Confederate States of America, which led to the American Civil War. After the war, immigration from Europe resumed, some Americans grew very wealthy during the Gilded Age, and the country developed one of the largest economies in the world.

by User Lonna (7.2k points)