How were German Americans treated during WWI?

by User Dboswell

1 Answer


Answer:

After WWI, Germany was treated as a pariah: humiliated by the victors, it lost territory and was forced to pay huge reparations.


by User Petras Purlys