Answer:
Step-by-step explanation:
World War I pushed Americans toward isolationism and pacifism. The war soured them on foreign affairs: many felt the United States had been pulled into a conflict that did not truly serve its interests. As a result, they hoped to stay out of foreign entanglements, making exceptions only for efforts to prevent future wars, such as the Washington Naval Conference (1921–22) and the Kellogg-Briand Pact (1928).