Answer:
When the war ended in an Allied victory, the wartime efforts on the home front and on foreign battlefields had brought lasting changes to American society and culture. World War II planted the roots of the Civil Rights Movement and the Women's Rights Movement, and it opened the way to widespread college education (through the G.I. Bill) and to employer-provided health insurance benefits.