Final answer:
World War I expanded employment opportunities for women in the United States and contributed to their winning the right to vote with the ratification of the 19th Amendment in 1920. The war also bolstered women's confidence by proving they could perform 'men's work,' thereby challenging traditional gender roles.
Step-by-step explanation:
World War I significantly impacted women in the United States in several ways. One of the main effects was the expansion of employment opportunities, as women entered the workforce in unprecedented numbers. They took on jobs in munitions factories, the civil service, banks, and offices, many of which had traditionally been reserved for men. This not only gave women greater financial independence but also enhanced their societal status.
Additionally, shortly after the war ended, women achieved a major victory in the fight for suffrage: the 19th Amendment, ratified in 1920, granted them the right to vote, due in part to recognition of their contributions to the war effort. This was a significant step toward gender equality in America and reinforced the notion that women were capable of the same responsibilities as men.
The war also fostered greater confidence among women, as they proved they could perform jobs that had typically been considered 'men's work.' This shift in societal perceptions played a crucial role in changing gender roles in the United States.