How did the United States' entry into World War I change the culture and international political policies of the country?

1 Answer


Answer:

World War I was called the "war to end all wars," and American entry had major consequences both at home and abroad. Internationally, the United States emerged from the war as a leading economic and military power, yet the Senate's rejection of the Treaty of Versailles and the League of Nations turned the country toward isolationism in the 1920s. At home, wartime mobilization expanded federal power, accelerated the Great Migration of African Americans to Northern cities, and helped build support for women's suffrage, ratified as the Nineteenth Amendment in 1920.


by Joel Jones