70.2k views
0 votes
How did World War II change the role of corporations in American life?

U.S. corporations became friendly and close collaborators with the federal government.
With the loss of their overseas affiliates in Asia and Europe, U.S. corporations once again became predominantly American.
Technological innovation and high productivity in the war effort restored the reputation of corporations from its Depression lows.
The heavy reliance of the Roosevelt administration on corporate leaders for its wartime agencies left U.S. corporations with the stain of government bureaucracy.
Thin profits during the war years forced U.S. corporations to dramatically innovate for increased efficiency.

asked by Brave Dave (8.2k points)

1 Answer

3 votes

Answer:

Technological innovation and high productivity in the war effort restored the reputation of corporations from its Depression lows.

Step-by-step explanation:

During World War II, a large portion of corporations shifted away from their normal production and focused on aiding the war effort by producing high-grade military weapons and vehicles.

The products these companies created were good enough to compete with German manufacturing, which was renowned as the best in Europe. This restored the reputation of corporations in the United States, which had previously been tarnished by wage discrimination and blatant environmental destruction from their operations.

answered by Aamirl (8.6k points)