What are some of the positive things that happened in America as a result of WWI? What new opportunities did people have?


1 Answer


Answer:

Step-by-step explanation:

The experience of World War I had a major impact on US domestic politics, culture, and society. Wartime labor shortages opened new jobs to women and to African Americans who moved north during the Great Migration, and women's contributions to the war effort helped secure the right to vote with the 19th Amendment. At the same time, other groups of American citizens, such as German Americans and antiwar dissenters, were subject to systematic repression.
