What were the effects of World War I on the United States?

asked by Michal S

1 Answer


Answer: Politically, World War I pushed the United States to expand its role in world affairs. The war also gave rise to the 'Lost Generation', a cohort of writers and intellectuals who had become disillusioned with the ideals and values of American consumer culture and political democracy.


answered by Miguel Febres