How did world war 1 change American attitudes and opinions about the future?


1 Answer

World War I significantly shaped American opinions about the future. The United States, for instance, began to isolate itself from world affairs and became increasingly unwilling to intervene in external disputes. At the same time, the war accelerated economic growth and shifted attitudes on issues such as women's rights.
