After WWI, what did Germany feel it needed to do as a country?

Group of answer choices:

- provide more jobs and build schools
- reclaim territory and regain prominence in Europe
- start another war
- support the people and give financial assistance

asked by Nathan Roe (6.2k points)

1 Answer

I think the answer is: reclaim territory and regain prominence in Europe. After losing land and standing under the Treaty of Versailles, Germany wanted to take back that territory and restore its position as a major European power.
answered by Vanesa (7.2k points)