Final answer:
The period following World War I saw significant social changes, including new momentum in the struggle for civil rights among African Americans, new roles for women in the workforce, and greater social and geographic mobility driven by technological advances.
Step-by-step explanation:
After World War I, several social changes reshaped American society: significant shifts in the fight for civil rights for African Americans, greater mobility made possible by advances in transportation and communication, new roles for women, and a general break with traditional norms. Women took on roles outside the traditional domestic sphere, entering the workforce in unprecedented numbers. This change was not limited to any single class or group of women; it included both those who had never worked outside the home and those who moved from lower-paid positions into better-paying ones. The war also opened opportunities for African Americans, who moved North in large numbers during the Great Migration in search of better jobs, a movement that continued after the war.
The struggle for civil rights also gained new momentum from the war, as organizations such as the NAACP continued to fight for equality and against segregation. These social changes were significant, but many advances met resistance and did not always produce lasting change. Nevertheless, the World War I era marked the beginning of shifting attitudes and laid a foundation for later advances in civil rights and gender equality.