STATE OF NEW YORK
________________________________________________________________________
8209
IN SENATE
January 12, 2024
___________
Introduced by Sen. COONEY -- read twice and ordered printed, and when
printed to be committed to the Committee on Internet and Technology
AN ACT to amend the state technology law, in relation to enacting the
New York artificial intelligence bill of rights
The People of the State of New York, represented in Senate and Assembly, do enact as follows:
Section 1. Short title. This act shall be known and may be cited as the "New York artificial intelligence bill of rights".
§ 2. Legislative intent. The legislature hereby finds that this generation of humans is the first in history with the ability to create technologies that can make decisions which previously could have been made only by humans. States and countries across the world are grappling with critical questions of how we can use these technologies to solve our problems, how we can avoid or manage the new problems that these technologies may create, and how we can control these powerful technologies.
Therefore, the legislature declares that any New York resident affected by any system that makes decisions without human intervention shall be entitled to certain rights and protections to ensure that the systems impacting their lives do so lawfully, properly, and with meaningful oversight.
Among these rights and protections are: (i) the right to safe and effective systems; (ii) protections against algorithmic discrimination; (iii) protections against abusive data practices; (iv) the right to have agency over one's data; (v) the right to know when an automated system is being used; (vi) the right to understand how and why an automated system contributed to outcomes that impact one; (vii) the right to opt out of an automated system; and (viii) the right to work with a human in the place of an automated system.
The legislature also finds that automated systems will continue to be developed and evolve both within the state and outside the state. It is therefore critical that New York does not overburden the development of innovative systems that better the state and its residents, nor drive the development of such systems to foreign states or countries with less appropriate regulation, nor threaten the security of our state, our country, and their people.
To these ends, the legislature declares that the white paper published by the White House Office of Science and Technology Policy in October of 2022, titled "Blueprint for an AI Bill of Rights", is commensurate with the goals of this state in relation to artificial intelligence.
§ 3. The state technology law is amended by adding a new article IV to read as follows:
ARTICLE IV
ARTIFICIAL INTELLIGENCE BILL OF RIGHTS
Section 401. Definitions.
        402. Application.
        403. Construction.
        404. Safe and effective systems.
        405. Algorithmic discrimination practices.
        406. Data privacy.
        407. Notice and explanation.
        408. Human alternatives, consideration, and fallback.
        409. Penalties; no private cause of action.
§ 401. Definitions. As used in this article, the following terms shall have the following meanings:
1. "Civil rights, civil liberties, and privacy" or "rights, opportunity, and access" means such rights and protections provided for in the United States Constitution, federal law, the laws and constitution of the state of New York, and privacy and other freedoms that exist in both the public and private sector contexts, which shall include, but shall not be limited to:
(a) freedom of speech;
(b) voting rights;
(c) protections from discrimination;
(d) protections from excessive or unjust punishment; and
(e) protections from unlawful surveillance.
2. "Equal opportunity" means equal access to education, housing, credit, employment, and other programs.
3. "Access to critical resources or services" means such resources and services that are fundamental for the well-being, security, and equitable participation of New York residents in society, which shall include, but shall not be limited to:
(a) healthcare;
(b) financial services;
(c) safety;
(d) social services;
(e) non-deceptive information about goods and services; and
(f) government benefits.
4. "Algorithmic discrimination" means circumstances where an automated system contributes to an unjustified different treatment or impact which disfavors people based on their age, color, creed, disability, domestic violence victim status, gender identity or expression, familial status, marital status, military status, national origin, predisposing genetic characteristics, pregnancy-related condition, prior arrest or conviction record, race, sex, sexual orientation, or veteran status, or any other classification protected by law.
5. "Automated system" means any system, software, or process that affects New York residents and that uses computation as a whole or part of a system to determine outcomes, make or aid decisions, inform policy implementation, collect data or observations, or otherwise interact with New York residents or communities. Automated systems shall include, but not be limited to, systems derived from machine learning, statistics, or other data processing or artificial intelligence techniques, and shall exclude passive computing infrastructure.
6. "Passive computing infrastructure" shall include any intermediary technology that does not influence or determine the outcome of decisions, make or aid in decisions, inform policy implementation, or collect data or observations, including web hosting, domain registration, networking, caching, data storage, or cybersecurity.
7. "Communities" means neighborhoods, social network connections, families, people connected by affinity, identity, or shared traits, and formal organizational ties. This includes Tribes, Clans, Bands, Rancherias, Villages, and other Indigenous communities.
8. "Social network" means any connection of persons which exists online or offline.
9. "Families" means any relationship, whether by blood, choice, or otherwise, where one or more persons assume a caregiver role, primary or shared, for one or more others, or where individuals mutually support and are committed to each other's well-being.
10. "Equity" means the consistent and systematic fair, just, and impartial treatment of all New York residents. Systemic, fair, and just treatment shall take into account the status of New York residents who belong to underserved communities that have been denied such treatment, such as Black, Latino, and Indigenous and Native American persons, Asian Americans and Pacific Islanders and other persons of color; members of religious minorities; women, girls, and non-binary people; lesbian, gay, bisexual, transgender, queer, and intersex persons; older adults; persons with disabilities; persons who live in rural areas; and persons otherwise adversely affected by persistent poverty or inequality.
11. "Sensitive data" means any data and metadata:
(a) that pertains to a New York resident in a sensitive domain;
(b) that are generated by technologies in a sensitive domain;
(c) that can be used to infer data from a sensitive domain;
(d) about a New York resident, such as disability-related data, genomic data, biometric data, behavioral data, geolocation data, data related to the criminal justice system, relationship history, or legal status such as custody and divorce information, and home, work, or school environmental data;
(e) that has the reasonable potential to be used in ways that are likely to expose New York residents to meaningful harm, such as a loss of privacy or financial harm due to identity theft; or
(f) that is generated by a person under the age of eighteen.
12. "Sensitive domain" means a particular area, field, or sphere of activity in which activities being conducted can cause material harms, including significant adverse effects on human rights such as autonomy and dignity, as well as civil liberties and civil rights.
13. "Surveillance technology" means products or services marketed for or that can be lawfully used to detect, monitor, intercept, collect, exploit, preserve, protect, transmit, or retain data, identifying information, or communications concerning New York residents or groups.
14. "Underserved communities" means communities that have been systematically denied a full opportunity to participate in aspects of economic, social, and civic life.
§ 402. Application. The rights contained within this article shall be construed as applying to New York residents against persons developing automated systems that have the potential to meaningfully impact New York residents':
1. civil rights, civil liberties, and privacy;
2. equal opportunities; or
3. access to critical resources or services.
§ 403. Construction. The rights contained within this article shall be construed as harmonious and mutually supportive.
§ 404. Safe and effective systems. 1. New York residents have the right to be protected from unsafe or ineffective automated systems. These systems must be developed in collaboration with diverse communities, stakeholders, and domain experts to identify and address any potential concerns, risks, or impacts.
2. Automated systems shall undergo pre-deployment testing, risk identification and mitigation, and shall also be subjected to ongoing monitoring that demonstrates they are safe and effective based on their intended use, mitigation of unsafe outcomes including those beyond the intended use, and adherence to domain-specific standards.
3. If an automated system fails to meet the requirements of this section, it shall not be deployed or, if already in use, shall be removed. No automated system shall be designed with the intent or a reasonably foreseeable possibility of endangering the safety of any New York resident or New York communities.
4. Automated systems shall be designed to proactively protect New York residents from harm stemming from unintended, yet foreseeable, uses or impacts.
5. New York residents are entitled to protection from inappropriate or irrelevant data use in the design, development, and deployment of automated systems, and from the compounded harm of its reuse.
6. Independent evaluation and reporting that confirms that the system is safe and effective, including reporting of steps taken to mitigate potential harms, shall be performed and the results made public whenever possible.
§ 405. Algorithmic discrimination practices. 1. No New York resident shall face discrimination by algorithms, and all automated systems shall be used and designed in an equitable manner.
2. The designers, developers, and deployers of automated systems shall take proactive and continuous measures to protect New York residents and communities from algorithmic discrimination, ensuring the use and design of these systems in an equitable manner.
3. The protective measures required by this section shall include proactive equity assessments as part of the system design, use of representative data, protection against proxies for demographic features, and assurance of accessibility for New York residents with disabilities in design and development.
4. Automated systems shall undergo pre-deployment and ongoing disparity testing and mitigation, under clear organizational oversight.
5. Independent evaluations and plain language reporting in the form of an algorithmic impact assessment, including disparity testing results and mitigation information, shall be conducted for all automated systems.
6. New York residents shall have the right to view such evaluations and reports.
§ 406. Data privacy. 1. New York residents shall be protected from abusive data practices via built-in protections and shall maintain agency over the use of their personal data.
2. Privacy violations shall be mitigated through design choices that include privacy protections by default, ensuring that data collection conforms to reasonable expectations and that only strictly necessary data for the specific context is collected.
3. Designers, developers, and deployers of automated systems must seek and respect the decisions of New York residents regarding the collection, use, access, transfer, and deletion of their data in all appropriate ways and to the fullest extent possible. Where not possible, alternative privacy by design safeguards must be implemented.
4. Automated systems shall not employ user experience or design decisions that obscure user choice or burden users with default settings that are privacy-invasive.
5. Consent shall be used to justify the collection of data only in instances where it can be appropriately and meaningfully given. Any consent requests shall be brief, understandable in plain language, and provide New York residents with agency over data collection and its specific context of use.
6. Any existing practice of complex notice-and-choice for broad data use shall be transformed, emphasizing clarity and user comprehension.
7. Enhanced protections and restrictions shall be established for data and inferences related to sensitive domains. In sensitive domains, individual data and related inferences may only be used for necessary functions, safeguarded by ethical review and use prohibitions.
8. New York residents and New York communities shall be free from unchecked surveillance; surveillance technologies shall be subject to heightened oversight, including at least pre-deployment assessment of their potential harms and scope limits to protect privacy and civil liberties.
9. Continuous surveillance and monitoring shall not be used in education, work, housing, or any other contexts where the use of such surveillance technologies is likely to limit rights, opportunities, or access.
10. Whenever possible, New York residents shall have access to reporting that confirms respect for their data decisions and provides an assessment of the potential impact of surveillance technologies on their rights, opportunities, or access.
§ 407. Notice and explanation. 1. New York residents shall be informed when an automated system is in use, and New York residents shall be informed how and why the system contributes to outcomes that impact them.
2. Designers, developers, and deployers of automated systems shall provide accessible plain language documentation, including clear descriptions of the overall system functioning, the role of automation, notice of system use, identification of the individual or organization responsible for the system, and clear, timely, and accessible explanations of outcomes.
3. The provided notice shall be kept up-to-date, and New York residents impacted by the system shall be notified of any significant changes to use cases or key functionalities.
4. New York residents shall have the right to understand how and why an outcome impacting them was determined by an automated system, even when the automated system is not the sole determinant of the outcome.
5. Automated systems shall provide explanations that are technically valid, meaningful to the individual and any other persons who need to understand the system, and proportionate to the level of risk based on the context.
6. Summary reporting, including plain language information about these automated systems and assessments of the clarity and quality of notice and explanations, shall be made public whenever possible.
§ 408. Human alternatives, consideration, and fallback. 1. New York residents shall have the right to opt out of automated systems, where appropriate, in favor of a human alternative. The appropriateness of such an option shall be determined based on reasonable expectations in a given context, with a focus on ensuring broad accessibility and protecting the public from particularly harmful impacts. In some instances, a human or other alternative may be mandated by law.
2. New York residents shall have access to timely human consideration and remedy through a fallback and escalation process if an automated system fails, produces an error, or if they wish to appeal or contest its impacts on them.
3. The human consideration and fallback process shall be accessible, equitable, effective, maintained, accompanied by appropriate operator training, and shall not impose an unreasonable burden on the public.
4. Automated systems intended for use within sensitive domains, including but not limited to criminal justice, employment, education, and health, shall additionally be tailored to their purpose, provide meaningful access for oversight, include training for New York residents interacting with the system, and incorporate human consideration for adverse or high-risk decisions.
5. Summary reporting, which includes a description of such human governance processes and an assessment of their timeliness, accessibility, outcomes, and effectiveness, shall be made publicly available whenever possible.
§ 409. Penalties; no private cause of action. 1. Where an operator of an automated system violates or causes a violation of any of the rights stated within this article, such operator shall be liable to the people of this state for a penalty of not less than three times the damages caused.
2. The penalty provided for in subdivision one of this section may be recovered by an action brought by the attorney general in any court of competent jurisdiction.
3. Nothing set forth in this article shall be construed as creating, establishing, or authorizing a private cause of action by an aggrieved person against an operator of an automated system who has violated, or is alleged to have violated, any provision of this article.
§ 4. This act shall take effect on the ninetieth day after it shall have become a law.