New York is a Surveillance State

You could end up under police suspicion just for typing words into Google. And when you step outside, an array of cameras could be watching your every move, while a police drone hundreds of feet away zooms in for a closer look at what you’re up to. Meanwhile, at your child’s school, a camera outfitted with facial recognition technology is constantly scanning students’ faces as they walk through the halls.

These scenarios are all too plausible in the over-surveilled state of New York. But as frightening as this all may sound, the solution isn’t to cower in fear. There are steps lawmakers can take to secure our privacy.

Resources

Government drone data across NY

Dragnet warrants are trapping innocent people

NY is ignoring the ban on facial recognition in schools

Transcript

[00:00:00] Simon: Welcome to Rights This Way, a podcast from the New York Civil Liberties Union, the ACLU of New York State. I’m Simon McCormack, senior staff writer at the NYCLU, and your host for this podcast, which is focused on the civil rights and liberties issues that impact New Yorkers most.

[00:00:24] You could end up under police suspicion just for typing words into Google. And when you step outside, an array of cameras could be watching your every move while a police drone hundreds of feet away zooms in for a closer look at what you’re up to. Meanwhile, at your child’s school, a camera outfitted with facial recognition technology is constantly scanning students’ faces as they walk through the halls. These scenarios are all too plausible in the over-surveilled state of New York. But as frightening as this all may sound, the solution isn’t to cower in fear. There are steps lawmakers can take to secure our privacy. We’ll get into this in just a moment.

[00:01:04] First, I’d like to ask you to please download, subscribe to, rate, and review Rights This Way. It will help more people find this podcast. And now I’m joined by two guests: Daniel Schwarz, the NYCLU’s Senior Privacy and Technology Strategist, and Stephanie Coyle, the Deputy Director of the NYCLU’s Education Policy Center.

[00:01:27] Daniel, Stephanie, thank you for coming on Rights This Way.

[00:01:31] Stephanie: Thanks Simon for having us.

[00:01:32] Daniel: Thanks for having us.

[00:01:34] Simon: Great to have you. Daniel, I, I wanna start with you. Um, I mentioned in the intro the idea that people can actually come under suspicion just for Googling something. Can, can you talk about that and about what are called reverse warrants?

[00:01:51] Daniel: Sure. Thanks, Simon. So when we talk about reverse search warrants, we typically think of two types of methods for these newer search warrants. Um, one is reverse location warrants, and the other type is reverse keyword warrants. And for these search warrants, law enforcement essentially exploits the massive data collection facilitated and aggregated by big tech nowadays.

[00:02:16] So as opposed to traditional search warrants, where police submit a request to a provider narrowly about the data of a specific suspect, these search warrants, um, function in a completely different way – essentially in reverse. They are not narrowly drafted and not based on probable cause. Instead, they exploit these data aggregation systems and request, in the case of reverse location warrants, the data of anyone who was within a specific radius or perimeter at a particular time, or, in the case of reverse keyword warrants, the data of anyone who searched for a particular search term.
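
To make those mechanics concrete: a reverse location warrant amounts to a radius-and-time filter over a provider’s stored location history, and a reverse keyword warrant to a text filter over search logs. A minimal, purely illustrative sketch – the record layouts and field names below are assumptions, not any provider’s actual schema:

```python
from dataclasses import dataclass
from datetime import datetime
from math import asin, cos, radians, sin, sqrt

# Hypothetical record layouts, assumed for illustration only.
@dataclass
class LocationRecord:
    user_id: str
    lat: float
    lon: float
    timestamp: datetime

@dataclass
class SearchRecord:
    user_id: str
    text: str

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in meters."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6_371_000 * asin(sqrt(a))

def geofence_hits(records, lat, lon, radius_m, start, end):
    """Reverse location warrant analogue: every user whose stored location
    falls inside the fence during the window. Suspicion attaches to mere
    presence, not to any individualized probable cause."""
    return {
        r.user_id
        for r in records
        if start <= r.timestamp <= end
        and haversine_m(r.lat, r.lon, lat, lon) <= radius_m
    }

def keyword_hits(queries, term):
    """Reverse keyword warrant analogue: every user who searched the term."""
    return {q.user_id for q in queries if term.lower() in q.text.lower()}
```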

[00:02:56] And as you can already imagine, Google, for obvious reasons, is one of the prime targets of these dragnet warrants. We have over the years, with a large coalition, advocated for Google to release data on these types of warrants, especially the geofence warrants, because we didn’t know enough about them and, um, how great the harm is, how many people are impacted. And fortunately, Google followed these asks and released data about geofence warrants. And as part of the transparency reports, we now know that from 2018 to 2020, those requests increased twelvefold, totaling just under 21,000 requests in the US alone. And each of those requests could implicate hundreds or even thousands of people.

[00:03:45] So, the harms are no longer theoretical or academic. Real people – potentially hundreds of thousands of people – are falling under these dragnet searches and are being searched. Uh, the data is being opened up to law enforcement, and we’ve seen those warrants implicate people and put people under, uh, false arrest. Those reverse search warrants have been used at protests, uh, including at protests against police violence in Wisconsin and in Minnesota.

[00:04:16] We’re asking, um, for New York legislators to stop this practice.

[00:04:20] Simon: And Daniel, so there’s legislation that would end these warrants. Can you talk about what that legislation would do specifically?

[00:04:29] Daniel: Yeah. Fortunately, in New York we have a model bill – the first in the nation – um, that we worked on together with our partners at STOP. It would prohibit the practice of reverse location and reverse keyword warrants. And it is broadly supported by, um, a large number of civil rights, privacy, racial justice, and public defender organizations.

[00:04:51] And it is also supported by the biggest tech companies in the world, including Apple, Microsoft, Google, Twitter, and Facebook. And so we’re really prioritizing this bill in the next legislative session and hoping for the Senate and the Assembly to pass this bill.

[00:05:08] Simon: So now I wanna turn to you, Stephanie, because I think that maybe one of the last places people might still consider a place that doesn’t have super invasive surveillance might be schools. But actually, you’ve been at the forefront of keeping what’s called biometric surveillance out of schools.

[00:05:29] So how did this come about and what is this technology that, that some companies and school districts in New York State and across the country are, are actually eager to get their hands on?

[00:05:43] Stephanie: Thanks, Simon. So, um, I’ll start by talking a little bit about what biometric surveillance is, um, and why someone would use it in a school – which, I still don’t know the answer to that question – but, um, biometric surveillance, um, sometimes we call it biometric identifying technology, basically covers a ton of different types of technologies that can identify a person based on biological, physical, or behavioral characteristics like their face, their eyes, their voice, sometimes even how they walk. Um, and it includes facial recognition, which is the type of technology that I think people have heard of the most. And our work in this area really started in a town in western New York called Lockport.

[00:06:25] And that district in 2018 used millions of dollars of state funding to purchase a facial recognition system. And they decided to install it in their schools to use against the children. Um, and we actually found out about this from some concerned parents in the district who were like, ‘What is this? Why would the district be spending all this money on a system like this?’

[00:06:48] And so, you know, these systems work in general by using the video surveillance in schools, and Lockport used facial recognition to attempt to identify individuals, including kids as young as five, by matching them against a watch list. But these systems in general don’t work and are super problematic. So in general, it’s really concerning.
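
It may help to see how small that mechanism really is. A stripped-down sketch of the watch-list matching step – the embedding model is stubbed out, and the threshold and rates below are illustrative assumptions, not details of Lockport’s actual vendor:

```python
import numpy as np

def cosine_sim(a, b):
    """Similarity between two face-embedding vectors (stand-ins here)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_against_watchlist(face_vec, watchlist, threshold=0.85):
    """Compare one camera capture against every watch-list entry; return
    the best match if it clears the (made-up) threshold, else None."""
    name, ref = max(watchlist.items(), key=lambda kv: cosine_sim(face_vec, kv[1]))
    return name if cosine_sim(face_vec, ref) >= threshold else None

# The base-rate problem: a hallway camera compares thousands of faces a day,
# so even a tiny false-match rate yields routine misidentifications.
# (Both numbers below are illustrative assumptions.)
captures_per_day = 10_000
false_match_rate = 0.001  # 0.1%
print(captures_per_day * false_match_rate, "expected false 'hits' per day")
```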

[00:07:09] Schools should be safe places for kids to learn and feel supported, not places where they’re constantly surveilled. Um, and we have particular concerns with facial recognition in schools because these systems are inaccurate. They don’t accurately identify people of color, women, and young people. Um, oftentimes the systems are really biased because the databases of images they compare against are overpopulated with young men of color, who are overpoliced in their communities.

[00:07:36] Um, it’s also a real invasion of privacy of the students, the parents, the teachers, everyone that walks through a school. And it really, you know, it treats kids as if they’re suspects, um, and increases the likelihood that kids are gonna have to interact with law enforcement. We also have major concerns about how the information from the system is stored, and whether it’s protected from hacking, and who has access to it, um, including law enforcement or immigration authorities. And also, you know, if you think about it, money that a school is spending on this sort of technology can be much better spent actually supporting kids and their learning and mental health.

[00:08:15] And so we’ve been working on this issue now for four and a half years, which is wild to think about. You know, once we found out about this, we sent countless, uh, advocacy letters, public records requests. We worked together with community members, organizations from across New York State and the country. We actually sued the state education department twice, um, over its approval of Lockport’s system.

[00:08:39] And the good thing is that, you know, one of the other things that we did is actually worked with legislators and got a law passed in 2020. And so because of that, the Lockport City School District was actually forced to deactivate its system. Um, and right now it is illegal for a school district in New York – or at least it’s supposed to be – uh, to purchase or use a biometric identifying system, including facial recognition.

[00:09:05] So we’ve come a long way, but still very concerning that a lot of school districts are seeking these sort of systems.

[00:09:13] Simon: Along those lines, Steph, what would you say to a parent or someone who just says, you know, ‘I’m, I’m so scared about my child’s safety. Kids face so many threats. Um, I, I’m willing to do whatever it takes to make sure my child is safe.’ What is the problem with that argument when it, when it comes to this technology?

[00:09:33] Stephanie: Yeah. So I mean, I would say like that’s everyone’s number one concern at school, right? Schools and school communities, they wanna keep kids safe. Um, but these sort of systems are not the way to do that. You know, I think the concerning thing is what we know keeps kids safe is actually if children feel comfortable confiding in an adult.

[00:09:54] And telling them either that they are having a hard time or one of their friends is. And those sorts of, like, mental health supports are really what keep kids safe. But I think here what we’ve seen is that there are a bunch of predatory school security companies that have really taken advantage of schools and school communities’ worst fears, um, and have kind of come up with this idea that they sell some product that is magically going to keep kids safe. But that’s not actually true. And unfortunately, you know, we’re seeing New York State has become a target for these sort of predatory companies because there’s a big pot of money that’s been set aside by the state. And so, you know, what I would say is that what we need to actually invest in is mental health supports for kids and making sure kids feel safe and have a trusted adult at school that they can turn to if something’s going on.

[00:10:45] Simon: What is the latest in terms of the biometric ban and keeping this tech out of schools? And correct me if I’m wrong, Steph, but I don’t think there’s really been any evidence that this tech has ever stopped a school shooting or done anything like that. Is that fair to say?

[00:11:01] Stephanie: That is fair to say, Simon, yeah. There’s no evidence that any sort of technology like this – despite the claims of the companies that sell it – uh, has actually kept kids safe. Um, and in fact, we know, you know, there could be real harms to a child who is, uh, misidentified by a system like this.

[00:11:19] If a system misidentifies a kid as, you know, someone who is not supposed to be in the school, and the police show up anticipating, you know, an intruder, um, that could create a really dangerous situation for that child and for all the folks in the school community. And in terms of the latest: as I mentioned, in 2020 we were able to work to enact a moratorium in New York State on the purchase or use of any sort of biometric identifying technology in schools, and that includes facial recognition. And this was actually in direct response to what happened in Lockport. Um, and this bill was the first of its kind, uh, in the country. So part of what the ban does – it’s a temporary ban – in addition to, you know, prohibiting the purchase and use of any sort of biometric identifying technology, it also requires that the State Education Department work together with the Office of Information Technology Services to study whether it’s a good idea for these sorts of technologies to be used in schools.

[00:12:22] Um, and it requires them to put together a report and hold public hearings. And so this law has been in effect since the end of 2020. But unfortunately, we actually have uncovered through some research at the NYCLU that there are school districts who have actually purchased facial recognition systems with this state funding.

[00:12:42] And I mentioned this earlier, but Lockport used a pot of money that, um, is available through the state of New York, called the Smart Schools Bond Act. It set aside, in 2014, $2 billion that most schools used to upgrade their wifi and buy laptops, smart boards, and other instructional technology that actually helps kids.

[00:13:04] Um, but some districts used it on high tech security. Because there’s this big pot of money, New York has really become a target for companies that want to sell their products, and unfortunately, there’s not a lot of oversight about, you know, what is being purchased with this money. And we found that a number of school districts have purchased video surveillance systems that include facial recognition.

[00:13:28] Simon: Daniel, I wanna ask you about the level of surveillance that, that we experience just when we’re walking around outside, especially, I know in, in New York City, which is maybe especially surveilled, but what does that look like?

[00:13:43] Daniel: That’s a great question, Simon. Unfortunately, we don’t have a clear answer to that, because the NYPD has been fighting tooth and nail against releasing even the most basic information. We have the POST Act, which is a very modest, limited, um, surveillance transparency mandate on the NYPD, and they have not followed that mandate – really just hiding and obscuring their information, keeping the systems opaque, and not being transparent about what the actual surveillance infrastructure looks like. What we do know about the state of the street-level surveillance you’re alluding to is that various intersecting technologies come together there.

[00:14:24] So as you can imagine, there are tens of thousands of networked surveillance cameras spread out across New York. There are ShotSpotter audio recording devices that claim to be able to detect when a gunshot occurs in a neighborhood, triangulate its location, and immediately call a 911 response to that location.
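
The triangulation idea itself is simple enough to sketch: a sound reaches nearby sensors at slightly different times, and those time differences pin down a location. A toy, idealized multilateration example – the real system’s sensor layout, algorithms, and accuracy are proprietary and, as the speakers note, contested:

```python
import itertools
from math import hypot

SPEED_OF_SOUND = 343.0  # m/s in air at ~20 °C

def locate_by_tdoa(sensors, arrival_times, grid_step=5.0, extent=1000.0):
    """Brute-force multilateration: search a grid for the point whose
    predicted arrival-time differences best match the observed ones.
    `sensors` is a list of (x, y) positions in meters; times in seconds.
    Coarse and slow, but fine for a demo."""
    observed = [t - arrival_times[0] for t in arrival_times]

    def error(x, y):
        dists = [hypot(x - sx, y - sy) for sx, sy in sensors]
        predicted = [(d - dists[0]) / SPEED_OF_SOUND for d in dists]
        return sum((p - o) ** 2 for p, o in zip(predicted, observed))

    steps = range(int(-extent / grid_step), int(extent / grid_step) + 1)
    coords = [i * grid_step for i in steps]
    return min(itertools.product(coords, coords), key=lambda pt: error(*pt))

# Example: three sensors hear a hypothetical bang that occurred at (120, -80).
sensors = [(0.0, 0.0), (400.0, 0.0), (0.0, 400.0)]
true_x, true_y = 120.0, -80.0
times = [hypot(true_x - sx, true_y - sy) / SPEED_OF_SOUND for sx, sy in sensors]
print(locate_by_tdoa(sensors, times))  # ~ (120.0, -80.0)
```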

[00:14:47] There are various environmental sensors and automatic license plate readers, potentially cell site simulators that the NYPD has been using for a decade, and various other technologies, including technologies deployed by outside agencies – for example, through the LinkNYC kiosks that are now taking the spots of former, um, phone booths. And perhaps the technology that we’ve paid the most attention to in street-level surveillance at the moment – the one linking to the biometric surveillance aspect that Steph was talking about in schools – is basically the set of capabilities these networked cameras have. They’re all brought together in the domain awareness system, which is the central hub of the NYPD’s surveillance technologies. So, you have these tens of thousands of cameras, um, you have all these sensors, you have other previously siloed databases that can be analyzed and correlated and combined with information; there’s analytics, there’s machine learning and AI predictive technologies.

[00:15:52] And these predictive technologies are also really concerning when you think about the unconstitutional and racially biased stop-and-frisk practices of the police. If those predictive systems make use of historical policing data, we know already that the outcomes of those systems will be similarly biased, especially racially biased – that is what academics have called “bias in, bias out” in terms of those machine learning systems.
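
That feedback loop can be made concrete with a toy simulation – and this is only that, a hypothetical sketch, not any vendor’s actual model. Two neighborhoods have identical true incident rates, but one starts out over-patrolled, and next year’s patrols follow this year’s records:

```python
import random

random.seed(0)

# "Bias in, bias out" as a feedback loop: recorded incidents scale with
# patrol presence, and the "predictive" allocation follows the records.
TRUE_RATE = 0.1                # identical in A and B by construction
patrols = {"A": 80, "B": 20}   # historically skewed deployment

for year in range(5):
    recorded = {
        hood: sum(random.random() < TRUE_RATE for _ in range(n))
        for hood, n in patrols.items()
    }
    total = sum(recorded.values()) or 1
    share_a = recorded["A"] / total
    patrols = {"A": round(100 * share_a), "B": 100 - round(100 * share_a)}
    print(f"year {year}: recorded={recorded} next patrols={patrols}")
```

Even though the neighborhoods are identical by construction, the skewed starting data keeps the allocation skewed indefinitely: bias in, bias out.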

[00:16:18] As for the cameras, we know that the NYPD has been using facial recognition for over a decade. Information is very sparse, so we don’t have many details. Fortunately, the Georgetown Center on Privacy and Technology went through four years of litigation to get basic information about these systems, and what they found is incredibly concerning: very unscientific practices, from photoshopping suspects’ photos, to using celebrity lookalikes, to entering stale data that should have been deleted, juvenile photographs, et cetera, et cetera.

[00:16:52] Beyond facial recognition, they also have other video analytics capabilities – object recognition and behavior detection. Behavior detection and emotion recognition are similarly affected by the racial bias, um, that Steph mentioned earlier with regard to facial recognition. We also don’t know the sheer scale of the cameras deployed.

[00:17:13] Our friends at Amnesty International have run a great project called, um, the Decoders project, under the Ban the Scan campaign, um, where they crowdsourced the detection of cameras all over New York through Google Street View imagery at intersections. By its very nature, you know already that this is only an estimate attempting to get at the absolute number of cameras.

[00:17:35] But we don’t have a better way at this point to understand just what the scale of NYPD CCTV cameras is. And through their crowdsourced research, they came to a total of around 25,000 cameras at road intersections, um, and that includes public and private cameras. In a federal grant report – the domain awareness system is majorly funded by federal DHS grants – the number mentioned was already 50,000 cameras. What is also really concerning is the software that is being used. We know already that the NYPD has submitted at least 11,000 searches to the infamous vendor Clearview AI, which has been in the news a lot for its unethical practice of collecting billions of photographs from social media sites.

[00:18:23] So, virtually everyone who has ever used a social media site and didn’t have their account fully locked down – whether that’s Facebook, Instagram, Twitter, LinkedIn, et cetera, et cetera – their photos are very likely to be in the Clearview AI database. That puts basically everyone who has used those platforms over the last decade in the crosshairs of law enforcement. And it’s not just the NYPD that has used it. All over New York, more than 60 law enforcement departments have piloted Clearview AI.

[00:18:55] Simon: And so one of the surveillance tools that we’re seeing used more and more is the police use of drones. Can you talk about that and about what we and others are doing to put the clamps on this?

[00:19:09] Daniel: Thanks, Simon. Yeah, so drone use has also exploded over the years, based on the advancements in drone technology – um, cheaper costs and better capabilities: flying longer, navigating more autonomously, needing less training for staff, and just, uh, having better cameras and AI technologies on board. Also, here we were suffering from an information scarcity, where only very few departments were disclosing information about the drones they have, under what circumstances, um, they’re using them, and what limitations and protections there are.

And unfortunately we don’t have any laws in New York that regulate or limit police drones. Until recently, we were only aware of around 59 law enforcement drones in New York.

[00:19:59] Um, but we were suspecting that there are far more around. So we submitted, um, a freedom of information request to the FAA, the Federal Aviation Administration, where every drone has to be registered. And thanks to this request, we now have a complete picture of all the government drones in New York, and the numbers are staggering.

[00:20:21] There are around 530 New York government drones, and of those, 330 are registered by police departments and law enforcement agencies. Those are spread all over New York, and, um, as I said, we don’t have laws protecting New Yorkers from their use, so they can be used for all sorts of spying activities.
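
The analysis behind those numbers is, at its core, straightforward record counting over a registry export. A hypothetical sketch – the file name, column names, and the name-matching heuristic are all assumptions, and the real FAA data will differ:

```python
import csv
from collections import Counter

# Rough heuristic for spotting law enforcement registrants by name.
POLICE_HINTS = ("police", "sheriff", "law enforcement", "public safety")

def count_ny_government_drones(path):
    """Count NY government drone registrations in a CSV export that is
    assumed to already contain only government registrants."""
    totals = Counter()
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            if row.get("state", "").strip().upper() != "NY":
                continue
            totals["government"] += 1
            registrant = row.get("registrant_name", "").lower()
            if any(hint in registrant for hint in POLICE_HINTS):
                totals["law_enforcement"] += 1
    return totals

# Hypothetical usage, yielding figures like those cited above:
# count_ny_government_drones("government_drones.csv")
# -> Counter({'government': 530, 'law_enforcement': 330})
```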

They can be used at protests, and we’ve seen just that. If we look back to 2020, DHS used military hardware – a Predator drone – to spy on protestors. The ACLU of California uncovered aerial surveillance of protests all over California. Here in New York, now-Mayor Adams has mentioned using drones combined with the ShotSpotter audio recording devices on roofs all over the city.

[00:21:10] And also, if we look back to 2020, in the early days of the response to the Covid pandemic, a police department in Connecticut basically fell prey to snake oil salespeople who were selling drones with biometric surveillance that was supposed to measure heart rate and fever and detect coughing and sneezing – which of course was completely ineffective and the wrong response to the pandemic.

[00:21:37] And so what we’re seeing in our FOIA data is that the use is growing, and we desperately need to curtail the use of police drones specifically. And we actually have a bill for that. It was introduced by Senator Ramos and Assembly Member Kim. It would prohibit police drones from surveilling First Amendment-protected activities, so any protest and other such activities would be protected from drone surveillance.

[00:22:07] It would also require warrants for all other police drone usage, and it would set guardrails so that drones cannot be used with discriminatory and harmful technologies such as facial recognition, um, and cannot be used with crowd control devices or with weapons – and again, because there are no laws, they theoretically could be combined with weapons. And we’ve seen police departments in other parts of the country do exactly that.

[00:22:37] Simon: And so, okay. As we kind of wrap up, I do just wanna acknowledge one of the maybe unintended consequences of doing an episode like this – which is certainly designed to make people aware of the dangers and ubiquity of a lot of the surveillance technology that we are experiencing. I think some people may hear all that and be concerned, but at the same time sort of think, you know, well, this is just the way it is.

[00:23:02] Like, if I’m Googling, if I’m sending my kid to school, if I’m walking around outside, if I’m at a protest, it’s just a fact of life that my activities are gonna be surveilled. And as we wrap up, I want to get your thoughts on how we fight against that sort of inevitabilism, that feeling of being overwhelmed, like there’s nothing we can do – because that’s not actually true.

[00:23:27] And, and we’ve certainly mentioned legislation and things that we’ve done to, to curtail this. So I don’t want to say that, you know, we haven’t already talked about ways to fight this, but I, I just want to sort of on a psychological or sociological level, talk about how we maintain hope in, in the face of this surveillance.

[00:23:44] Stephanie: For those folks who maybe have lost hope listening to all of this, um, kind of scary science fiction, I would say: it doesn’t have to be this way. And I think the Lockport example is proof of that. Parents and students found out about this and decided that they didn’t want their kids to be subjected to this type of high tech surveillance, and they did something about it, right?

[00:24:07] They enlisted help, they organized with each other, and, you know, worked with us. We were able to put together a coalition, file some lawsuits, and work with the legislature. And so I think, you know, paying attention to what’s happening in your communities and attending school board meetings to find out what is happening and whether your district’s doing something like this are all things that you can do.

[00:24:30] Because I think, you know, a lot of these purchases – particularly in schools, right – are not super transparent. And so I think the more that we’re able to shine a light on them, the more people can take action against them. And a concrete action that folks can take relates to the state law, um, that we got passed, which I mentioned earlier.

[00:24:49] It requires a study, and it requires that the Office of Information Technology Services and the State Education Department – they just held a public hearing – um, put out a report and consult stakeholders, and that’s anyone who cares about this issue. You don’t have to have a child in school – anyone.

[00:25:10] And so they are accepting comments and they put out a survey. And so some of the things you can do is respond to that survey and give your opinion about why you think, you know, biometric surveillance should not be in schools. Um, and it’s really important for folks to have their voices heard, particularly parents and students and folks in school communities.

[00:25:28] Because that report, once it is put together, will inform whether or not school districts can use facial recognition in New York State. And we really hope that what comes out of this is that the moratorium becomes a permanent ban. Um, because, for so many reasons, um, we don’t think that facial recognition and other sorts of biometric surveillance belong in schools. Make your voice heard.

[00:25:54] And you know, it’s really important for folks to understand what’s going on in the schools in particular and take action if they don’t want these technologies, uh, near kids.

[00:26:05] Daniel: That’s great, Steph, and on top of what Steph has said, I think it’s perfectly understandable why people can feel helpless or powerless, because all those surveillance technologies are always deployed under a great, um, power imbalance. There’s an understandable lack of control, and it’s only natural to feel that way. But at the same time, I think people are becoming more and more cognizant of the harms of these technologies and are choosing to fight back.

[00:26:36] And there are so many surfaces to fight back on and levels to engage on. We now have the greatest choice ever of privacy-preserving technologies, although it can feel the opposite way. There’s more open source research, there’s more development of privacy-preserving technologies.

[00:26:56] Vendors are shifting, nonprofits are paying attention, and people are in the streets fighting for their rights to privacy and, um, fighting against technologies that negatively impact their civil rights and civil liberties. Beyond the individual level, of course, what we wanna focus on is the systemic level, and that largely happens on the legislative level.

[00:27:17] And beyond the bills and laws we are fighting for in New York, we also see great ripple effects from, um, legislative fights in other areas. So, California laws can also protect New Yorkers here. Um, the biometric protections in Illinois help New Yorkers against biometric surveillance technologies. And data protections in the European Union have also had effects and impacts on our systems here.

[00:27:44] So, it’s encouraging to see the network effects and how protections in one place can also protect people elsewhere. Here in New York, people were in the streets in Syracuse protesting the effects of surveillance technologies with regard to racial justice. And it resulted in the mayor issuing an executive order to ban facial recognition, to ban predictive policing, and to create a surveillance technology oversight working group.

[00:28:12] So, I’m hopeful that as we’re seeing more and more people going out into the streets, becoming active, fighting for legislation, and also just choosing technologies that have their best interests in mind and not a company’s bottom line, we’re seeing a shift. So, overall, I think I’m very optimistic about what’s to come.

[00:28:32] Simon: Well with that, Steph, Daniel, thank you so much for being on Rights This Way.

[00:28:38] Stephanie: Yeah, thanks Simon.

[00:28:39] Daniel: Thank you so much, Simon.

[00:28:42] Simon: Thank you for listening. You can find out more about everything we talked about today by visiting nyclu.org. And you can follow us @nyclu on Instagram, Twitter, and Facebook. If you have questions or comments about Rights This Way, you can email us at podcast@nyclu.org. Until next time, I’m Simon McCormack.

[00:29:04] Thank you for fighting for a fair New York.