2 votes
What are some of the positive things that happened in America as a result of WWI? What new opportunities did people have?

by User Puczo (4.2k points)

1 Answer

6 votes
Women’s rights were one positive outcome of WWI. Women were needed to do more than stay at home: they were asked to take factory jobs because of the boom in weapons production, the loss of men to enlistment, and an economy in need of support.
by User Fikret (3.8k points)