Answer:
The United States' role after World War I.
Step-by-step explanation:
The United States entered World War I and emerged victorious alongside the Allied Powers. As punishment, the Central Powers were forced to sign peace treaties. Under the Treaty of Versailles, Germany was required to pay reparations to the Allied Powers. After the war, American banks provided loans to European countries to help them make their reparations payments; the Allied Powers, in turn, used the German reparations they received to repay their own war debts to the United States.