110,045 views
12 votes
When did the US finally enter World War I?

by Lee Englestone (3.4k points)

2 Answers

8 votes

Final answer:

The United States officially entered World War I in April 1917: President Woodrow Wilson asked Congress on April 2 to declare war on Germany, and Congress formally declared war on April 6. The U.S. provided critical support to the exhausted Allied forces by sending troops and resources. This marked an end to American isolationism and the beginning of more active involvement in international affairs.

Step-by-step explanation:

U.S. Entry into World War I

The United States entered World War I in April 1917, after President Woodrow Wilson asked Congress to declare war on Germany on April 2, 1917. The entry was a pivotal moment in the war, as the Allies were nearing exhaustion and in desperate need of support. After years of isolationism and neutrality under Wilson, escalating tensions, most notably Germany's unrestricted submarine warfare and the sinking of American ships, finally led to U.S. involvement.

The Selective Service Act was passed the following month, resulting in millions of American men being registered and drafted into military service. This marked a significant shift from the United States' previous stance of isolation, and the first American troops arrived in France in June 1917, beginning a buildup that would eventually send some two million men overseas to aid the Allied forces.

Although the United States fought as a segregated force, its participation was crucial. The American Expeditionary Forces contributed greatly to the eventual defeat of the German armies on the Western Front. Despite the brevity of the American role, it was instrumental in the Allied victory and helped shape the outcome of World War I. America's entry into the war ultimately symbolized a break from its isolationist past and the start of more active engagement in global affairs, a trend that continued through World War II and beyond.

by Michael Fulton (3.6k points)
8 votes

Answer:

April 2, 1917 (Wilson's war address to Congress; Congress declared war on April 6).


by AFoglia (3.8k points)