What were the effects of World War I on America?


2 Answers

13 votes

Answer:

World War I had a huge impact on U.S. culture, domestic politics, and society. The war also increased demand for American weapons abroad, which boosted profits and productivity in the American steel industry. World War I also ushered in the era of chemical weapons.

4 votes

Answer:

The effects are listed below.

Step-by-step explanation:

Positive effects:

- The war positioned America to become the leading world power.

- America became the world's industrial hub.

- Women obtained the right to vote.

- African Americans and women gained access to a wider range of jobs.

- As industry boomed, the economy boomed.

- More previously unemployed people held jobs, and the finances of the public, which had been poor since the recession of 1897, improved.

- America joined an alliance with Great Britain and France.

Negative effects:

- Industrial production declined when the soldiers came home.

- There were not enough jobs for the returning soldiers.

- High unemployment rates played a part in bringing about the Great Depression.

- America's haste to join the war caused many soldiers to die.

Answers from Prezi.

Hope this helps :)
