United States experiences during the war


1 Answer


Answer:

World War I had lasting effects on America, including the industrial boom that preceded the Great Depression, wartime jobs for women that helped secure passage of the Nineteenth Amendment, and a shift in diplomacy and antiwar sentiment that influenced America's eventual entry into World War II.

