How did the war benefit Americans?

asked by Jim Wright

1 Answer


The war brought full employment and a fairer distribution of income. Black Americans and women entered the industrial workforce in unprecedented numbers. The war also consolidated union strength and brought far-reaching changes to agricultural life.

answered by Ingmars