4. How did the war benefit Americans?


1 Answer


The war brought full employment and a fairer distribution of income. Black Americans and women entered the industrial workforce in unprecedented numbers. The war also consolidated union strength and brought far-reaching changes to agricultural life.


