What caused changes to the traditional American culture after World War II ended?

asked by Branka

1 Answer


Answer:

Several major factors contributed to cultural change in the United States after World War II, but one of the greatest was a trend known as consumerism. Once the war ended, the country's revved-up industrial base shifted from producing weapons and other war materiel to automobiles, appliances, and other consumer goods that Americans could purchase and own.

Hope this helps!

answered by Bert Jan Schrijver