104,633 views
30 votes
What social changes were brought about in America during WWI?

by User Nail (3.4k points)

1 Answer

28 votes

Answer:

The conflict heralded the rise of conscription, mass propaganda, the national security state, and the FBI. It accelerated the income tax and urbanisation, and it helped make America the pre-eminent economic and military power in the world. The influence, expectations, and societal role of women expanded. The economy boomed and industries enjoyed rising profits. However, inflation was high, and, in an attempt to reduce operating costs, businesses laid off workers and cut wages.

by User Evan Cordeiro (2.9k points)