5 votes
Did World War I change the United States' role as a world leader?

asked by Soni Ali (4.7k points)

1 Answer

4 votes

Answer:

Despite isolationist sentiment at home, the United States emerged from the war as a world leader in industry, finance, and trade. Nations also became far more economically interconnected, ushering in the beginnings of what we now call the "world economy."

answered by AmigaAbattoir (5.0k points)