Why did America become isolationist after World War I?

1 Answer


Answer:

World War I pushed American public opinion and policy toward isolationism.

Step-by-step explanation:

The war's heavy cost, including over 100,000 American deaths, left many Americans disillusioned with involvement in European conflicts. The Senate rejected the Treaty of Versailles and refused to join the League of Nations, fearing membership would drag the United States into future wars. Policy in the 1920s and 1930s reflected this mood: high tariffs, strict immigration quotas, and eventually the Neutrality Acts of the 1930s. Many Americans preferred what President Harding called a "return to normalcy," focusing on domestic prosperity rather than foreign entanglements.

Answered by Sebastian Zeki