What role has Hollywood played in the portrayal of racism in America?

asked by XerXes

1 Answer


Final answer:

Hollywood has historically perpetuated racism by excluding and misrepresenting the Black experience in film, reinforcing negative stereotypes and limiting opportunities for Black actors and filmmakers.

Step-by-step explanation:

Hollywood has played a significant role in shaping how racism is portrayed in America. Historically, the film industry distorted or excluded the Black experience from its art, reinforcing negative stereotypes and limiting opportunities for Black actors and filmmakers. It has also frequently relied on White writers and directors to tell stories about racism and slavery, which has limited the authenticity and diversity of those narratives. This lack of representation, together with the misrepresentation of Black people and culture in film and television, has contributed to the dismissal and marginalization of Black perspectives in society.

answered by Elewinso