What lasting political changes were brought about by World War I?

asked by Uliwitness

1 Answer

The Great Migration brought African Americans, along with many Mexican Americans, to northern cities. Women gained new rights, though most of these were rolled back once the war ended. Education improved, wartime life made Americans more disciplined, and the economy boomed for a short while before the Great Depression hit. Anti-German sentiment was also widespread among Americans.
answered by Farhana