How did World War 1 change the US?

asked by Vahdet (7.6k points)

1 Answer


Answer: The experience of World War I had a major impact on US domestic politics, culture, and society.

answered by Thealon (8.0k points)

