How did World War I change the US?

asked by Spaudanjo

1 Answer


Answer:

The war fueled the Great Migration, the mass movement of African Americans out of the rural South, and Black soldiers who returned from the war, finding discrimination still in place, pressed for civil rights. The war also marked the advent of conscription, mass propaganda, the national security state, and the FBI.


answered by Towkir