Final answer:
The United States entered World War I in 1917.
Step-by-step explanation:
The United States entered World War I in 1917. The war began in 1914, and the U.S. remained neutral for the first few years. However, it eventually joined the conflict on the side of the Allies in response to Germany's unrestricted submarine warfare and the discovery of the Zimmermann Telegram. The U.S. played a significant role in the war and helped the Allies achieve victory in 1918.