What political effects did World War I have on the United States?

asked by Syam

1 Answer


Answer:

The experience of World War I had a major impact on US domestic politics, culture, and society. Women achieved the right to vote, while other groups of American citizens were subject to systematic repression.

Step-by-step explanation:

World War I reshaped American politics in lasting ways. Women's contributions to the war effort strengthened the case for suffrage, and the 19th Amendment, ratified in 1920, guaranteed women the right to vote. At the same time, the Espionage Act of 1917 and the Sedition Act of 1918 were used to suppress antiwar dissent and to target socialists, labor organizers, and German Americans, and wartime fears carried over into the Red Scare of 1919-1920.

answered by Chthonic Project