Answer:
The New Deal redefined the role of the federal government, convincing most ordinary Americans that the government not only could but should intervene in the economy and provide direct protection and support for American citizens.