The U.S. didn't directly take part in World War 1 at first; it stayed neutral while trading with nations actively involved in the war. That changed in April 1917, when the U.S. declared war on Germany and entered the conflict.
Hope I could help! :)