In what year did the U.S. enter World War I?

1 Answer

Final answer:

The United States entered World War I in 1917 and played a significant role in helping the Allied forces secure victory.


Step-by-step explanation:

The United States entered World War I in 1917, declaring war on Germany on April 6 of that year. Until then, the U.S. had maintained a policy of neutrality, but factors such as Germany's resumption of unrestricted submarine warfare, the sinking of the RMS Lusitania by a German submarine, and the Zimmermann Telegram, which proposed a military alliance between Germany and Mexico, led the U.S. to join the war.

Once the U.S. entered the war, it played a significant role in helping the Allied forces secure victory. The American Expeditionary Forces fought alongside British, French, and other Allied troops until the war ended with the Armistice of November 11, 1918.

