The war profoundly changed the role of women in America. Before the war, women were largely confined to domestic life: they were discouraged from participating in public affairs and expected to devote themselves to family matters. During the war, however, women took on roles they had never held before. They joined the workforce in large numbers and often became the primary providers for their families while the men were away fighting.
After the war, many women were unwilling to return to a purely domestic life. Feminism gained traction, women became more visible in the public sphere, and they secured greater legal rights and a new role in American society.