Answer:
According to depts.washington.edu:
World War I gave women a chance to show a male-dominated society that they could do more than raise children and stay at home. During the war, women played a vital role in keeping soldiers supplied with ammunition, and in many ways they kept the nation running through their work in various industries.