1 vote
How did America's entry into World War I change America?

by Riley Finn (5.8k points)

1 Answer

9 votes

Answer:

America's entry into World War I had a major impact on U.S. domestic politics, culture, and society. For example, women took factory jobs and other roles traditionally held by men, which helped them win the right to vote, while other groups of American citizens were subjected to systematic repression.

Step-by-step explanation:

I hope this helps!

by Sheldonhull (6.8k points)