Answer:
Women's roles continued to expand in the postwar era.
Step-by-step explanation:
Although women who remained in the workforce after the war were often demoted, their contributions during World War II made it much harder for men to claim superiority over them. Many women had enjoyed, and even thrived on, a taste of financial and personal freedom, and they wanted more of it.