5 votes
How did the war change American society?

asked by User Adhocgeek (5.2k points)

2 Answers

6 votes

Final answer:

The war changed American society by providing new opportunities and freedoms for women and minority groups, leading to a push for greater equality and civil rights.

Step-by-step explanation:

The war changed American society in numerous ways. For women, African Americans, and other minority groups, the war provided new opportunities and freedoms. Women entered the workforce in greater numbers, taking on jobs previously held by men. African Americans also found employment in new areas, breaking barriers in the workplace. These changes were not easily erased after the war, and they contributed to a push for greater equality and civil rights.

answered by User Jan Schaefer (5.7k points)
0 votes
World War I changed America and transformed its role in international relations. After Germany proposed that Mexico attack the United States if it did not remain neutral, Americans were ready to fight.
answered by User Buena (5.1k points)