True or False: The United States changed after WWI by making a strong turn inward.

True

False

1 Answer

The correct answer is True. After WWI the United States turned inward toward isolationism: it refused to join the League of Nations, raised tariffs, and passed restrictive immigration laws in the 1920s.