Monday Morning Thoughts: Should Davis Implement Facial Recognition Technology? Experts Consider It Racially Biased

By David M. Greenwald
Executive Editor

Davis, CA – Last week, Tim Kingston, a senior investigator and member of the SF Public Defender Racial Justice Committee, wrote a letter to the Davis Police Accountability Commission supporting a ban on Facial Recognition Technology.

“FRT needs to be flat out banned,” he writes. “As you are probably all aware, FRT is fundamentally biased against people of color. This is due to its failure to use demographically accurate training sets, resulting in a classic garbage-in, garbage-out technology. This is a foundational problem with the technology.”
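Kingston’s “garbage-in, garbage-out” point can be illustrated with a toy sketch. The data below is entirely hypothetical (invented group names, made-up 2-D “embeddings,” a simple nearest-neighbor matcher rather than any real FRT system), but it shows the mechanism he describes: a classifier trained on a gallery that under-represents one group will typically make more mistakes on that group.

```python
import math
import random

random.seed(42)

def sample(group, label):
    # Toy 2-D "face embedding": the group shifts the cluster center,
    # the binary label separates points within a group.
    gx = 0.0 if group == "A" else 6.0
    lx = 0.0 if label == 0 else 2.0
    return (random.gauss(gx + lx, 1.2), random.gauss(0.0, 1.2))

def make_set(n_a, n_b):
    data = []
    for group, n in (("A", n_a), ("B", n_b)):
        for _ in range(n):
            label = random.choice([0, 1])
            data.append((sample(group, label), group, label))
    return data

def predict(train, point):
    # 1-nearest-neighbor classification against the training gallery
    nearest = min(train, key=lambda row: math.dist(row[0], point))
    return nearest[2]

train = make_set(500, 25)   # group B badly under-represented in training
test = make_set(200, 200)   # but equally represented at test time

errors = {"A": 0, "B": 0}
counts = {"A": 0, "B": 0}
for point, group, label in test:
    counts[group] += 1
    if predict(train, point) != label:
        errors[group] += 1

rates = {g: errors[g] / counts[g] for g in errors}
print(rates)  # the under-represented group typically shows the higher error rate
```

The point of the sketch is only that the disparity comes from the training data, not the test subjects: both groups are equally easy to classify in principle, but the sparse gallery for group B yields a noisier decision boundary for that group.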

A 2020 study out of Harvard, for example, noted that while “[f]ace recognition algorithms boast high classification accuracy” (over 90 percent, in fact), “these outcomes are not universal.”

In the landmark 2018 “Gender Shades” project, an “intersectional approach was applied to appraise three gender classification algorithms, including those developed by IBM and Microsoft. Subjects were grouped into four categories: darker-skinned females, darker-skinned males, lighter-skinned females, and lighter-skinned males. All three algorithms performed the worst on darker-skinned females, with error rates up to 34% higher than for lighter-skinned males.”

The study also noted that an “[i]ndependent assessment by the National Institute of Standards and Technology (NIST) has confirmed these studies, finding that face recognition technologies across 189 algorithms are least accurate on women of color.”

The Harvard study noted, “This result corroborated an earlier assessment of Rekognition’s face-matching capability by the American Civil Liberties Union (ACLU), in which 28 members of Congress, disproportionately people of color, were incorrectly matched with mugshot images.”

The ACLU in 2020 concluded, “Face surveillance is the most dangerous of the many new technologies available to law enforcement. And while face surveillance is a danger to all people, no matter the color of their skin, the technology is a particularly serious threat to Black people in at least three fundamental ways.”

The results here are not theoretical.

Scientific American last year told the story of Robert Williams, an innocent Black father in suburban Detroit, who spent hours behind bars after police ran facial recognition software on footage from a store.

They write, “Sadly, Williams’ story is not a one-off. In a recent case of mistaken identity, facial recognition technology led to the wrongful arrest of a Black Georgian for purse thefts in Louisiana.”

The researchers found, “Our research supports fears that facial recognition technology (FRT) can worsen racial inequities in policing. We found that law enforcement agencies that use automated facial recognition disproportionately arrest Black people. We believe this results from factors that include the lack of Black faces in the algorithms’ training data sets, a belief that these programs are infallible and a tendency of officers’ own biases to magnify these issues.”

Last year Forbes noted, “A 2022 study conducted by institutions including Johns Hopkins University and the Georgia Institute of Technology programmed AI-trained robots to scan blocks with people’s faces from different races.

“After the robots scanned the faces, they were tasked with designating which blocks were criminals—they consistently labeled the blocks with Black faces as criminals.”

Last year the University of Calgary interviewed Dr. Gideon Christian, an expert on AI and the law.

“There is this false notion that technology unlike humans is not biased. That’s not accurate,” says Christian, PhD. “Technology has been shown (to) have the capacity to replicate human bias. In some facial recognition technology, there is over 99 per cent accuracy rate in recognizing white male faces. But, unfortunately, when it comes to recognizing faces of colour, especially the faces of Black women, the technology seems to manifest its highest error rate, which is about 35 per cent.”

This is “an unacceptable error rate, with damaging effects.” Christian cites cases in the U.S. in which Black men were misidentified by facial recognition software, arrested and detained.

He added, “What we have seen in Canada are cases (of) Black women, immigrants who have successfully made refugee claims, having their refugee status stripped on the basis that facial recognition technology matched their face to some other person. Hence, the government argues, they made claims using false identities. Mind you, these are Black women — the same demographic group where this technology has its worst error rate.”

Even aside from the problem of misidentification, the ACLU believes that facial recognition would simply reinforce existing disparities, if not exacerbate them.

For example, they cite the fact that “Black people are nearly four times more likely to be arrested for marijuana possession than white people.”

This matters because, even for a trivial offense, “[e]ach time someone is arrested, police take a mugshot and store that image in a database alongside the person’s name and other personal information.”

Thus, because Black people are far more likely to be arrested for minor crimes, “their faces and personal data are more likely to be in mugshot databases.”

Therefore, “the use of face recognition technology tied into mugshot databases exacerbates racism in a criminal legal system that already disproportionately polices and criminalizes Black people.”
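The ACLU’s feedback-loop argument is ultimately arithmetic, and a short sketch makes it concrete. The numbers below are purely illustrative (the group names, population shares, and the 4x arrest-rate multiplier are stand-ins, not figures from the article’s sources): if one group is arrested at several times the rate of another for the same behavior, its share of the mugshot database, and therefore of future face-recognition “hits,” grows far beyond its share of the population.

```python
# Hypothetical shares of the general population for two groups
pop = {"group_x": 0.13, "group_y": 0.60}

# Relative likelihood of arrest for the same offense (illustrative 4x disparity)
arrest_rate = {"group_x": 4.0, "group_y": 1.0}

# Each group's contribution to the mugshot database is proportional to
# population share times arrest likelihood.
weights = {g: pop[g] * arrest_rate[g] for g in pop}
total = sum(weights.values())
db_share = {g: weights[g] / total for g in weights}

for g in db_share:
    print(g, round(db_share[g], 2))
# group_x: 0.46, group_y: 0.54 -- a group that is 13% of the population
# ends up contributing nearly half of the database, and so nearly half
# of all possible face-recognition matches.
```

The sketch assumes nothing about the technology itself; the disparity in who can even be matched is baked in before any algorithm runs, which is the ACLU’s point.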

The ACLU goes even further, however, arguing that even if the government were to switch to driver’s license databases, “government use of face surveillance technology will still be racist. That’s because the entire system is racist.”

Here they cite Radley Balko, who has “carefully documented that Black people face overwhelming disparities at every single stage of the criminal punishment system, from street-level surveillance and profiling all the way through to sentencing and conditions of confinement.”

Where are the surveillance cameras? They are “disproportionately installed in Black and Brown neighborhoods.”

All of that aside, the most immediate reason to avoid facial recognition is that its error rate is unacceptably high, and the risk of wrongful arrest for innocent Black and Brown people should give us pause.

About The Author

David Greenwald is the founder, editor, and executive director of the Davis Vanguard. He founded the Vanguard in 2006. David Greenwald moved to Davis in 1996 to attend graduate school in political science at UC Davis. He lives in South Davis with his wife Cecilia Escamilla Greenwald and three children.
