How was America different following World War II?

Asked by Vermaete

1 Answer


Answer:

When World War II ended, the United States was in better economic condition than any other country in the world. ... Building on the economic base left after the war, American society became more affluent in the postwar years than most Americans could have imagined in their wildest dreams before or during the war.


Answered by Colin FAY