Answer:
Before WWI, women had very limited legal and political rights; in the United States they could not yet vote in national elections. As the war came around, women gained new opportunities and, soon after, new rights, most notably the vote through the 19th Amendment in 1920. During the war they were able to find jobs outside the home, and many went to work in factories producing munitions and other supplies for the armed forces. Some women even disguised themselves as men and enlisted in the Army, while many others served as nurses and medics in the war effort. Overall, women kept America's economy running while the men were shipped overseas.
Step-by-step explanation: