112k views
7 votes
What changes have taken place in the general public's attitude with regard to war as a result of the World War I experience?

asked by User Dileping (5.6k points)

1 Answer

4 votes

Answer:

Anti-war sentiments

Step-by-step explanation:

World War 1 completely shattered people's expectations of what war was, with brutal trench warfare and terrible conditions throughout the entire conflict. Before the war, fighting had widely been seen as an adventure, a sentiment that had persisted even after the American Civil War. People everywhere were horrified by World War 1, and books such as All Quiet on the Western Front were written about what soldiers had to endure daily.

answered by User Aleks (6.1k points)