17.0k views
5 votes
How important do you think women’s experiences in World War 1 were in relation to the changing roles of women?

by User Ztatic (5.2k points)

2 Answers

4 votes

Answer:

Women took on bigger roles during WWI than ever before: with the men gone, they ran the households and earned the money. Overall, women had more duties and responsibilities. Their experiences during WWI showed society what women could do for themselves, their families, their countries, and the world. Since then, women have kept taking on more and more roles and breaking barriers. WWI was a defining moment for women in the US and perhaps all over the world.


by User Jabba (5.8k points)
3 votes

Answer:

Very important

Step-by-step explanation:

Before the war, the only things women did were cook, clean the house, and take care of the kids. That isn't a bad thing, but it meant they didn't get a say in World War 1. Now in America it's normal for women to have an opinion on big topics like this. Women can be very intelligent, and so can men, but don't you think more capable minds are better than only a couple?

by User JohnOpincar (5.9k points)