Explain the role of the United States in WWI socially, economically and politically.

1 Answer

Final answer:

The United States played a significant role in World War I socially, economically, and politically.

Step-by-step explanation:

Socially, the war reshaped American society: it accelerated the Great Migration of African Americans from the South to Northern cities and shifted women's roles as they entered the workforce to fill jobs traditionally held by men. Economically, the war boosted the American economy, as the United States became a major supplier of food, munitions, and credit to the Allied powers and emerged from the conflict as a creditor nation. Politically, the United States entered the war in April 1917 after Germany resumed unrestricted submarine warfare and the Zimmermann Telegram came to light, and President Woodrow Wilson's vision of a postwar order of peace and collective security, articulated in his Fourteen Points, shaped the negotiations over the Treaty of Versailles and the proposed League of Nations.
