Increasingly, shopping at a store, going to school, driving your car, or even entering your own home could mean having your sensitive biometric information tracked and analyzed. Tools like facial, voice, or behavior recognition are being unleashed by governments and private industry, posing serious threats to our privacy, our security, and our civil rights.

Facial recognition is notoriously inaccurate, especially when it comes to identifying women and people of color. Other forms of biometric recognition have been shown to be similarly inaccurate and plagued by disparate impacts on Black people. These biased and error-prone tools are especially likely to be used on people of color, and they are becoming widespread with almost no regulation or oversight.

The NYCLU works to put a stop to the most harmful of these biometric surveillance technologies and to challenge discriminatory and abusive surveillance practices.