Answer:
Women took on larger roles during WWI than ever before: with so many men away at war, they ran households and became the primary earners for their families. Overall, women carried more duties and responsibilities than they had in the past. Their experiences during WWI showed society what women could do for themselves, their families, their countries, and the world. Since then, women have continued to expand their roles and break barriers. WWI was a defining moment for women in the US, and arguably around the world.