172k views
0 votes
What major historical event significantly altered America's international role?

a) World War I
b) The Great Depression
c) The Civil Rights Movement
d) The Boston Tea Party

asked by ColCh (7.1k points)

1 Answer

2 votes

Final answer:

The event that significantly altered America's international role was World War I (option a): it left the United States with far greater influence after the war and, following a period of isolationism, set the stage for its involvement in World War II.

Step-by-step explanation:

The major historical event that significantly altered America's international role is World War I (option a). After World War I, the United States emerged as a major economic power and began to play a far more substantial role in global affairs; the Treaty of Versailles and the creation of new nations highlighted America's increased influence on the world stage. The same period saw the rise of totalitarian regimes in Europe and expansionist policies in Asia, which led to World War II, in which America's role was crucial. During the interwar years the U.S. initially turned toward isolationism, focusing on its own economic problems, including the Great Depression. The attack on Pearl Harbor, however, ended that isolationism decisively and propelled the U.S. into the position of a global leader, particularly after its significant contributions to the Allied victory in World War II.

As earlier context, in the 1790s Americans had been deeply divided over the French Revolution, which represented competing ideals of governance and international alliances that were reflected in the domestic politics of the time.

answered by Bluemind (7.6k points)