Answer:
When World War II ended, the United States was in better economic condition than any other country in the world. ... Building on the economic base left after the war, American society grew more affluent in the postwar years than most Americans could have imagined before or during the war.