3 votes
When World War I broke out, the United States declared a policy of neutrality. Was the United States ever truly neutral in the conflict, and if so, when did its policy change to favoring the Allies?

by Agathe (5.9k points)

2 Answers

6 votes

In August 1914, as World War I erupted across Europe, President Woodrow Wilson declared that the United States would remain “impartial in thought as well as in action.” ... In February 1915, Germany declared unrestricted submarine warfare against all ships, neutral or not, that entered the war zone around Britain.

by Rick Sanchez (6.7k points)
1 vote

The US was never truly neutral. It did not officially pick a side at first, but it sold munitions that supplied the Allies; by doing so, it effectively chose to help the Allies and thus took a side.

by Mhodges (6.0k points)