WWI showcased the dark side of what?

asked by Mattshane (2.8k points)

1 Answer


Answer:

World War I was the deadliest conflict in human history up to that point, inflicting tens of millions of casualties on all sides.

The experience of World War I had a major impact on US domestic politics, culture, and society. Women achieved the right to vote, while other groups of American citizens were subjected to systematic repression.

answered by Dlu (3.0k points)