168,478 views
25 votes
Question For Serious People!

Since the federal government operates a number of facial recognition systems, such as the DHS at the border, why can't they use the technology to find the people behind the capitol crimes of January 2021?

by User Luzny (2.7k points)

2 Answers

27 votes
"Condensed Milk" is a short story by Varlam Shalamov (1907-1982), a Russian writer who spent over 15 years of his life at a Gulag, a Russian forced-labor camp. The story was written based on his real-life experience.


"Condensed Milk" is narrated from a first-
person perspective by the unnamed
protagonist. The narrative is chronological,
which means the narrator presents the
events and conflicts in the order they
happen. The advantage
by User Tiana (3.3k points)
20 votes

Answer:

See below.

Step-by-step explanation:

From the Los Angeles Times (paraphrased):

Facial recognition technology identifies people by capturing, cataloging, and searching databases of millions of images of faces — 641 million as of 2019, in the case of the FBI's facial recognition unit. Images can be drawn from government systems, such as driver's license photos, or, in the case of Clearview AI, from images copied from social networks and other internet sites.
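At its core, the matching step described above compares a numeric "embedding" of the probe face against a gallery of stored embeddings and returns the closest identity above a similarity threshold. A minimal sketch of that idea (the embedding values, names, and threshold here are all hypothetical, not from any real system):

```python
import math

def cosine(a, b):
    # Cosine similarity between two embedding vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def best_match(probe, gallery, threshold=0.8):
    # Return the identity whose stored embedding is most similar to the
    # probe, or None if no candidate clears the threshold.
    best_id, best_score = None, threshold
    for identity, emb in gallery.items():
        score = cosine(probe, emb)
        if score >= best_score:
            best_id, best_score = identity, score
    return best_id

# Hypothetical 3-dimensional gallery (real systems use hundreds of dimensions).
gallery = {
    "alice": [0.9, 0.1, 0.3],
    "bob":   [0.1, 0.8, 0.5],
}
print(best_match([0.88, 0.12, 0.28], gallery))  # prints "alice"
```

The threshold is the critical knob: set it too low and the system produces false matches, which is exactly where the bias discussed below shows up.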

Research has shown, however, that the technology fails to correctly identify people of color at much higher rates. According to a federal study published in 2019, Black and Asian people were approximately 100 times more likely than white people to be misidentified by facial recognition.
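The "100 times more likely" figure is a ratio of false-match rates between demographic groups: how often the system declares a match between two different people. A sketch of how such a disparity is computed, using entirely synthetic trial counts chosen to illustrate a 100x gap (not the study's actual data):

```python
def false_match_rate(trials):
    # Fraction of different-person trials the system wrongly declared a match.
    non_mated = [t for t in trials if not t["same_person"]]
    wrong = sum(1 for t in non_mated if t["matched"])
    return wrong / len(non_mated)

# Synthetic results: 1,000 different-person comparisons per group.
group_a = [{"same_person": False, "matched": m} for m in [True] + [False] * 999]
group_b = [{"same_person": False, "matched": m} for m in [True] * 100 + [False] * 900]

fmr_a = false_match_rate(group_a)  # 0.001
fmr_b = false_match_rate(group_b)  # 0.1
print(f"group B is falsely matched {fmr_b / fmr_a:.0f}x as often")  # 100x
```

A system can look accurate on average while one group bears a disproportionate share of the false matches, which is why evaluations break error rates out per group.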

The issue could lie in how the software is trained and by whom. According to a study published by New York University's AI Now Institute, artificial intelligence can be shaped by the environment in which it is built. This includes the tech industry, which is notorious for its lack of gender and racial diversity. According to the study, such systems are almost exclusively developed in environments that "tend to be extremely white, affluent, technically oriented, and male." The lack of diversity may extend to the data sets used to train facial recognition software: studies have revealed that some were largely trained on databases of images of light-skinned males.

This points to one conclusion: facial recognition could be used, but its results would carry substantial racial bias, because a system is only as good as the data and the environment it was trained in.

by User Illishar (2.5k points)