How might those attitudes have changed as the war went on for several years?

I'm in 8th grade.

asked by DanEEStar

1 Answer


Answer:

Step-by-step explanation:

World War I made Americans more isolationist and pacifist. The war soured them on foreign affairs: many felt the United States had been pulled into a conflict that did not really matter to its own interests. As the war dragged on for years, that disillusionment deepened. Afterward, Americans hoped to stay out of foreign entanglements, taking action abroad only in efforts to prevent another war.

answered by Jin Kwon