New York Civil Liberties Union
Testimony Before the New York Assembly Standing Committee on Consumer Affairs and Protection and the New York Assembly Standing Committee on Science and Technology
Artificial intelligence (AI) and automated decision systems (ADS), software tools or processes that automate, replace, or aid human decision-making, are widely used to administer services, allocate resources, customize products and offerings, and make inferences about individuals, groups, or places. Whether deployed by government agencies or private businesses, their ubiquity and opaque deployment risk severely undermining the civil, human, and privacy rights of New Yorkers. The use of ADS is often marked by an acute power imbalance between those who deploy these systems and those affected by them, particularly because ADS typically operate without transparency or even the most basic legal protections.
The New York State Legislature must act to bring meaningful transparency and accountability to ADS and to ensure that these systems do not digitally circumvent New York’s laws against discrimination. Any regulation must cover ADS broadly; mandate comprehensive and impartial impact assessments; require transparency and clear notice to affected people; and provide both opportunities to contest the results of such tools and viable paths to request reasonable accommodations. New Yorkers should not have to worry about being screened by a discriminatory algorithm when applying for housing, work, or credit; they should not have to fear faulty software tools affecting their health care or education; and they should not be offered different opportunities or choices based on their demographics.
To achieve these goals, we offer the Digital Fairness Act (A.3308/S.2277), the Bossware and Oppressive Technology Act (BOT Act) (A.9315-A/S.7623-B), and the New York Department of Financial Services AI Circular Letter as exemplary frameworks for the Legislature to consider as it engages further with issues related to AI and ADS.