38.5k views
2 votes
What role did the U.S. play during World War 1?

by User Jacinta (5.3k points)

1 Answer

3 votes

The U.S. did not directly take part in World War 1 at first. Although it stayed neutral, America traded with nations actively involved in the war. That changed in 1917, when the U.S. declared war on Germany and entered the fighting on the side of the Allies.

Hope I could help! :)

by User Antken (5.6k points)