Final answer:
Before WWII, gender roles in American society centered on the expectation that women would manage the home while men earned wages outside it. During the war, women entered the workforce in large numbers. After the war, although many women returned to domestic life, some continued to work, and societal expectations about women's roles had begun to shift.
Step-by-step explanation:
Before WWII, gender roles in American society rested on the notion that women should focus on domestic duties while men held paid jobs outside the home. During the war, with millions of men serving overseas, women entered the workforce in far greater numbers and took on work traditionally reserved for men, including jobs in factories, shipyards, and defense plants. After the war, many women were expected to give up these positions to returning veterans and resumed their domestic roles, but some continued to work outside the home, and the wartime experience had begun to change societal expectations about what women could do.