How did World War One change women's roles in the United States?


2 Answers

1 vote

Women had to do the jobs of the men who went away to the war.

User Mchid
by
5.3k points
1 vote

Answer:

They had to adapt to perform the jobs of the men who went away to the war.

Step-by-step explanation:

When the men left for war, many businesses had no one working for them. Women learned the job requirements fairly quickly if they didn't already know how to do the work, and they took over.

User Amit Maniar
by
5.4k points