Identify the role of women in American society before, during, and after World War I.


1 Answer


Answer:

Before the war: women were not as commonly found in the workforce.

During the war: women grew victory gardens to support the war effort and worked as nurses near the front lines.

After the war: women were granted voting rights throughout the United States and had greater pride and confidence in their abilities.

Step-by-step explanation:

Before the war, women were not as commonly found in the workforce; as in every country of the world, they served primarily as caregivers for children and elders.

During the war, many women grew "victory gardens" to support the war effort and worked as nurses near the front lines.

After the war, the Nineteenth Amendment (ratified in 1920) granted women the right to vote throughout the United States, and their wartime contributions gave them greater pride and confidence in their abilities.
