0 votes
What social changes were brought about in America during WWI?

1 Answer

4 votes

Answer:

The conflict heralded the rise of conscription, mass propaganda, the national security state, and the FBI. It accelerated the expansion of the income tax and urbanisation, and it helped make America the pre-eminent economic and military power in the world. The influence and expectations of women, and their role in society, grew. The economy boomed and industries enjoyed rising profits. However, inflation was high, and in an attempt to reduce operating costs, businesses laid off workers and cut wages.

by User Bob Thule (4.9k points)