45.1k views
0 votes
How did WWI affect the United States at home, and what finally brought the war to an end?

asked by Blafasel (4.2k points)

2 Answers

0 votes
It affected the US at home because it led to the Roaring Twenties: women who had taken wartime jobs pushed for workplace equality, the Great Migration fueled a push for racial equality, and wartime production boosted the economy.
answered by Malvika (4.2k points)
4 votes

Answer:

The entry of the United States was the turning point of the war, because it made the eventual defeat of Germany possible. It had been foreseen as early as 1916 that if the United States went to war, the Allies' military effort against Germany would be sustained by U.S. supplies and by enormous extensions of credit.

Step-by-step explanation:

American troops and supplies tipped the balance on the Western Front in 1918. By that autumn, Germany's allies had surrendered, its army was in retreat, and revolution was breaking out at home. Germany signed the armistice of November 11, 1918, which finally brought the war to an end.

answered by TarranJones (4.2k points)