Answer:
World War I had a profound impact on life in the United States. On the home front, the war stirred a new wave of patriotism and nationalism and accelerated social and economic change. Many women, African Americans, and members of other marginalized groups joined the war effort, both in the military and in the workforce. Their participation fostered a greater sense of independence and self-determination within these groups and won wider recognition of their rights and contributions to society. The war also produced an economic boom, as factories and farms expanded output to meet wartime demand, which in turn brought higher wages, improved working conditions, and new job opportunities for many Americans.