Final answer:
Gender roles are society's expectations of how men and women should behave, based on the norms and standards that society creates.
Step-by-step explanation:
The term gender role refers to society's concept of how men and women are expected to look and behave. These roles are based on norms, or standards, created by society. In U.S. culture, masculine roles are usually associated with strength, aggression, and dominance, while feminine roles are usually associated with passivity, nurturing, and subordination. Learning these roles begins with socialization at birth.