How did World War One lead to significant changes in society?

asked by Hanggi (8.3k points)

1 Answer


World War One changed the lives of Americans socially, politically and economically. The war had a massive impact on almost every part of society, particularly women, workers and minorities. The American public felt a strong sense of nationalism and patriotism during the war, as the country was unified against a foreign threat. However, the war also led to constant scrutiny of, and racial prejudice against, minority groups such as immigrants from Southern and Eastern Europe.

answered by Mafue (8.1k points)