How did the United States change after World War II?

1 Answer


Answer: The entry of the United States into World War II caused vast changes in virtually every aspect of American life. ... Building on the economic base left after the war, American society became more affluent in the postwar years than most Americans could have imagined in their wildest dreams before or during the war.

