What are some positive and negative changes that occurred in the United States in the years after World War 2?

1 Answer


Answer:

American society became more affluent in the postwar period.

