How did the lives of Americans change after World War II?

asked by Ling Vu (5.6k points)

1 Answer

The entry of the United States into World War II caused vast changes in virtually every aspect of American life. ... Building on the economic base left after the war, American society became more affluent in the postwar years than most Americans could have imagined in their wildest dreams before or during the war.
answered by Muhammad Shaharyar (5.6k points)