Hey there,
World War II transformed the role of the federal government and the lives of American citizens. To secure industrial peace and stabilize war production, the federal government forced reluctant employers to recognize unions.
Hope this helps :))
~Top