211k views
0 votes
In popular culture, the cowboy is mostly depicted as a solitary, strapping white male who happens to be good with a gun and will resort to violence when needed. The reality was that cowboys worked in groups, weren’t all white (or big), and weren’t required to have gun skills to work the job. If this is the case, why did/does popular culture portray cowboys the way they’re shown? Please provide two to three possible answers.

by User Koalo (6.1k points)

1 Answer

5 votes

Answer:

Americans have a tendency to whitewash history, and most of these stereotypes came from Hollywood. Once the original Western films set the scene, later movies tended to follow the same formula.

by User Bsiddiqui (5.1k points)