Imagine that the goal of a project is to detect people at work. The model has been fed thousands of training examples in which men are coding and women are cooking. The algorithm is likely to learn that coders are men and chefs are women, which is wrong since women can code and men can cook. This situation is an example of Prejudice Bias.

Select one: True OR False

Asked by Miraculixx

1 Answer


Final answer:

True. The situation described is an example of Prejudice Bias: the algorithm learned and reinforced stereotypes by associating coding with men and cooking with women.

Step-by-step explanation:

True

Prejudice bias occurs when a model is trained on data that reflects existing stereotypes or prejudices, so the model learns and reinforces those biases. In this case, the training data over-represents men coding and women cooking, so the algorithm learns that coders are men and chefs are women, which is incorrect and perpetuates gender stereotypes. Addressing and mitigating prejudice bias, for example by auditing or balancing the training data, is important to ensure fairness in machine learning.
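To make the mechanism concrete, here is a minimal sketch (the dataset counts and the naive majority-vote "model" are hypothetical, invented for illustration) showing how skewed training data alone produces the stereotyped prediction:

```python
from collections import Counter

# Hypothetical biased training set: each example pairs a person's
# gender with the activity labeled in the image.
biased_data = (
    [("man", "coding")] * 950 + [("man", "cooking")] * 50
    + [("woman", "cooking")] * 940 + [("woman", "coding")] * 60
)

counts = Counter(biased_data)

def predict(gender):
    """Naive 'model': predict the activity seen most often for a gender."""
    options = {act: counts[(gender, act)] for act in ("coding", "cooking")}
    return max(options, key=options.get)

print(predict("man"))    # coding
print(predict("woman"))  # cooking -- the learned stereotype, not reality
```

Even this trivial frequency-based predictor reproduces the bias in its training data, which is exactly what a real classifier does on a larger scale: it optimizes for the correlations present in the data, stereotyped or not.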

Answered by Henon