1 vote
How did the lives of Americans change after World War II?

asked by Crmpicco (7.7k points)

1 Answer

4 votes

The entry of the United States into World War II caused vast changes in virtually every aspect of American life. ... Building on the economic base left after the war, American society became more affluent in the postwar years than most Americans could have imagined in their wildest dreams before or during the war.


answered by Yieldsfalsehood (8.0k points)