What was the role of America after WWI?


Answer:

Under President Woodrow Wilson, the United States remained neutral until 1917 and then entered the war on the side of the Allied powers (the United Kingdom, France, and Russia). The experience of World War I had a major impact on US domestic politics, culture, and society.
