
This Company Sold Schools Facial Recognition Tech It Knew Was Racist. The Reality Was Even Worse

By: Simon McCormack Senior Writer, Communications & Stefanie Coyle Deputy Director, Education Policy Center

The company that makes the facial recognition system installed in Lockport, New York’s public schools admits its system is less accurate when used on Black people.

It claims the algorithm the system uses to identify people misidentifies Black men twice as often as white men, and Black women 10 times more often than white men.

That’s not exactly reassuring. But an independent audit of SN Technologies’ system, obtained by the NYCLU, found that the real numbers are even worse. The audit said the system misidentified Black men four times more often than white men, and Black women 16 times more often.

The audit does not address the algorithm’s accuracy on children, but multiple studies have found facial recognition is particularly faulty when used on young people.

Documents we reviewed also show that the system has misidentified objects as guns. That’s especially worrying because these misidentifications could lead to police officers being deployed to a school expecting an active-shooter incident. Even if there is no actual shooter, officers arriving expecting one can traumatize students and put them in danger.


The NYCLU has been sounding the alarm about Lockport’s system since 2018. We are concerned about the racially biased algorithms the system employs and about the technology’s potential to track students’ every move, turning their youthful behavior into evidence of a crime. We are also concerned that the system captures students’ faces, making that biometric information a target for hackers and other bad actors.

Earlier this year, we sued the New York State Education Department after it gave Lockport the go-ahead to use the facial recognition system. There are significant questions as to whether the department knew SN Technologies’ accuracy claims were inflated and approved the system anyway.

SN Technologies’ faulty claims highlight another problem with facial recognition in schools. Districts are basing their decisions to deploy this technology on unfounded assertions from companies that have every incentive to exaggerate their products’ effectiveness in order to secure millions of dollars in new contracts.

School districts are not equipped to weigh these claims effectively, and they are likely to do what Lockport did: throw money at a company that made promises it couldn’t back up.

This dynamic could play out across the state, as nearly a dozen districts have expressed interest in putting facial recognition in their schools. If that number grows, the racist algorithms, criminalization, and privacy invasions will spread with it.

But there’s a way to put the brakes on this.

There is a bill on Gov. Cuomo’s desk that would keep facial recognition and other biometric surveillance out of schools and require a study of the impacts of this technology.

You can tell Cuomo to sign the bill here.
