The Nazis did not come to power in Germany promising to:

asked by Bladexeon (4.1k points)

1 Answer


Answer:

The Nazi Party was one of a number of right-wing extremist political groups that emerged in Germany following World War I. Beginning with the onset of the Great Depression, it rose rapidly from obscurity to political prominence, becoming the largest party in the German parliament in 1932.


answered by Eveningsun (4.4k points)