The United States changed after WWI by making a strong turn inward.

True

False

1 Answer

The correct answer is True. After WWI the United States turned inward: the Senate rejected membership in the League of Nations, and the country adopted isolationist policies such as higher tariffs and restrictive immigration laws.

