Comments re: Proposed Rules Governing the New York City Identification Card Program (NYC ID)
New York Civil Liberties Union
October 7, 2014
The New York Civil Liberties Union (NYCLU) is pleased to provide public comments on the rules proposed to govern the New York City Identification Card program (NYC ID). While supportive of the concept of a local ID card that carries benefits for marginalized New York City residents, the NYCLU remains concerned about the absence of adequate privacy protections: first in the law itself, and now in the rules proposed to implement it.
With 50,000 members and supporters and eight offices statewide, the NYCLU is the foremost defender of civil liberties and civil rights in New York State and a longstanding advocate for the right of all people to access government functions and services. Our priorities include advocating for the rights of immigrants across New York, while also protecting individuals’ privacy from government intrusion and unnecessary risk. When the law creating the NYC ID was signed in July 2014, we were disappointed by its lack of meaningful privacy protections and its disregard for the serious risks that participants in the program would be taking.
Specifically, the NYC ID program requires the City to store New Yorkers’ most sensitive documentation, such as pay stubs, bank statements, health records, and even their children’s educational records, for two years. While these documents are retained by the City, they can be accessed by law enforcement, including the NYPD, FBI, DHS, and other agencies, without a demonstration of probable cause. Because it does not meet constitutional privacy standards, we cannot support the current iteration of the NYC ID program.
Recommendations
1. Notice & Confidentiality
There are means to establish and strengthen privacy protections in the NYC ID program through the rulemaking process. The first of these is the establishment of a notice requirement that will be triggered by a law enforcement request for a cardholder’s information. This is a basic privacy protection that exists in many local agency rules already.
The Human Resources Administration (HRA) should provide notice to any cardholder whose documentation is requested by any entity outside of HRA. As an example, a similar provision exists in regulations governing the Department of Social Services with respect to requests for public assistance records.1 Under these regulations, the Department of Social Services must seek the permission of any individual whose records are requested, even when that information is being subpoenaed by a court.
The Social Services rules also direct the department to plead confidentiality in response to any subpoena for records where the purpose is not directly related to the administration of public assistance or to the welfare of a child. Similarly, the NYC ID rules could direct HRA to plead confidentiality in any case where permission is not granted and the request is not related to the administration of the NYC ID itself. By sharing their personal information with HRA, participants in NYC ID are implicitly agreeing to allow HRA to use that information in the program’s administration. But most applicants will have no idea that the City may share their information without consulting or even notifying them.
The failure of the current rules to include any provisions for notice is a breach of the public’s trust. Without a notice requirement, the City is stripping NYC ID cardholders of their ability to defend their own privacy against requests by agencies including Immigration and Customs Enforcement and the Department of Homeland Security. This leaves the City responsible for defending the privacy interests of the program’s participants, of whom there will likely be at least tens of thousands. Without a requirement that HRA plead confidentiality in response to requests outside of its responsibility for administering the program, the City leaves cardholders’ information vulnerable to an ever-expanding universe of requestors who obtain court orders. Some of these requestors will be serving purposes directly at odds with NYC ID’s goal of protecting vulnerable populations.
2. Redaction
Second, we recommend that the rules include provisions for redacting information that is irrelevant to the administration of the NYC ID program from applicants’ duplicated documents before they are stored. The documents at issue contain some of the most sensitive and private information that people have in their possession, information that people take the greatest care to protect from disclosure to third parties, including family members and close friends. This includes information, like education and health records, that is protected by federal law. Not coincidentally, these are documents that law enforcement would never have legal access to without a court order, except through the NYC ID program.
The City can make a serious commitment to protecting that information by directing HRA to redact any information contained in a document that is not relevant to the purpose for which the document was proffered. For example, if an applicant provides a hospital record as proof of residency, all information beyond the person’s name and address, and any letterhead or insignia that authenticates the document, should be redacted from the copy in the City’s possession. This will prevent unnecessary information leakage that could put things like special education status or a health diagnosis into the hands of law enforcement without a judicial warrant. Redaction is an inexpensive strategy that will not harm the City’s goal of carefully monitoring for fraud, but it will make an enormous difference in the security and sanctity of private information.
3. Facial Recognition Software
Finally, we understand that the administration intends to adopt facial recognition technology in administering NYC ID. We are concerned about the complete lack of proposed regulations regarding the use of this technology. Facial recognition technology is not immune to error,2 and it can easily be abused. For these reasons, we strongly recommend that the following three regulations be adopted as minimum protections regarding this technology.
First, we recommend the rules limit both the use of the technology and the retention of photos. Specifically, facial recognition technology must be used only at the time of application for the card. The photos and results of any matches should be subject to at least the same privacy protections and retention schedule as the other information in the program. In keeping with the character of a program designed to benefit undocumented people and other marginalized New Yorkers, the rules should explicitly state that the program will limit photo comparisons to other photos in the NYC ID database and not permit comparisons with FBI, NYPD, or any other local, state, or national databases.
Second, the City should provide notice to NYC ID card applicants that facial recognition software is being used in the administration of the program. Before an applicant’s photo is taken, HRA should explicitly state that facial recognition software is being used and should explain what data is being collected, how the City will store and use that data, and whether it can be accessed by or shared with any other agency. As is the case with other aspects of this program, New Yorkers have a right to know when they are taking a privacy risk.
Finally, the rules should create a transparent and meaningful appeals process for people who believe they were mistakenly matched through the facial recognition process. Not only has the FBI experienced a 20 percent error rate in its use of facial recognition technology,3 but experts on the software have also voiced concerns that databases of photos or biometric data may be susceptible to breaches and hacking.4 Given these potential avenues for error, an accessible appeals process is critical to remedy the mistaken matches that are likely to occur.
Conclusion
NYC ID’s lack of privacy protections creates unacceptable risk for too many potential applicants. Combined with the City’s failure to create notice provisions in the administration of this program, these lapses mean that vulnerable New Yorkers will likely not know when and how their information is being used, stored, and shared by City agencies. The NYCLU encourages the City to treat the rulemaking process as an opportunity to improve and strengthen this program by adopting commonsense privacy protections. We recommend that the City give notice before applicants’ documents are disclosed, direct HRA to plead confidentiality if served with a subpoena, redact information not necessary for verifying residency or identity, and articulate specific privacy protections governing how long images are retained and how photos captured by the program will be kept private. We urge the City to refocus its implementation of this program on the vulnerable communities it was intended to serve, rather than allowing NYC ID to become a tool for law enforcement. By adopting these simple privacy protections, the City can take important steps in that direction.
Footnotes
1 Department of Social Services Rules, 18 NYCRR §357.3(f)(1)-(3)
2 See e.g., J.D. Tuccille, “Wrong Person May Be Identified 20 Percent of the Time With Facial Recognition Software,” Reason, Oct. 8, 2013, available at http://reason.com/blog/2013/10/08/wrong-person-may-be-identified-20-percen; Meghan E. Irons, “Man Sues Registry After License Mistakenly Revoked,” The Boston Globe, Jul. 17, 2011, available at http://www.boston.com/news/local/massachusetts/articles/2011/07/17/man_sues_registry_after_license_mistakenly_revoked/.
3 See Reason, supra note 2.
4 Federal Trade Commission, FACING FACTS: BEST PRACTICES FOR COMMON USES OF FACIAL RECOGNITION TECHNOLOGIES, Oct. 2012, at 7.