What permanent changes to the federal government resulted from the New Deal?

asked by Dubafek

1 Answer


Answer:

The New Deal permanently redefined the role of the federal government, convincing most ordinary Americans that the government not only could but should intervene in the economy and protect and directly support American citizens. Lasting institutional changes include Social Security, federal deposit insurance through the FDIC, securities regulation through the SEC, and a federal minimum wage under the Fair Labor Standards Act.

answered by Suly