How did American culture change during the 1920s?

asked by User Roshit (3.6k points)

1 Answer

Answer:

The 1920s were a decade of profound social change. The most visible signs were the rise of a consumer-oriented economy and of mass entertainment, which together helped bring about a "revolution in morals and manners." Sexual mores, gender roles, hairstyles, and dress all changed markedly during the decade.

answered by User Miedkes (3.4k points)