18 votes
How did America's entry into World War I change the country?

asked by User Orvin (2.5k points)

1 Answer

13 votes

Answer:

America's entry into World War I had a major impact on U.S. domestic politics, culture, and society. For example, women took factory jobs and filled roles left vacant by men, which helped them win the right to vote, while other groups of American citizens were subjected to systematic repression.

I hope this helps!

answered by User Eligio (3.2k points)