Bill Text: HI HB1607 | 2024 | Regular Session | Introduced


Bill Title: Relating To Algorithmic Discrimination.

Spectrum: Partisan Bill (Democrat 13-0)

Status: (Introduced) 2024-01-24 - Referred to HET/JHA, referral sheet 1

HOUSE OF REPRESENTATIVES

H.B. NO. 1607

THIRTY-SECOND LEGISLATURE, 2024

STATE OF HAWAII

A BILL FOR AN ACT

relating to Algorithmic Discrimination.

BE IT ENACTED BY THE LEGISLATURE OF THE STATE OF HAWAII:

     SECTION 1.  The Hawaii Revised Statutes is amended by adding a new chapter to be appropriately designated and to read as follows:

"Chapter

ALGORITHMIC DISCRIMINATION

     §   -1  Definitions.  As used in this chapter:

     "Adverse action" means a denial, cancellation, or other adverse change or assessment regarding an individual's eligibility for, opportunity to access, or terms of access to important life opportunities.

     "Algorithmic eligibility determination" means a determination based in whole or in significant part on an algorithmic process that utilizes machine learning, artificial intelligence, or similar techniques to determine an individual's eligibility for, or opportunity to access, important life opportunities.

     "Algorithmic information availability determination" means a determination based in whole or in significant part on an algorithmic process that utilizes machine learning, artificial intelligence, or similar techniques to determine an individual's receipt of advertising, marketing, solicitations, or offers for an important life opportunity.

     "Covered entity" means any individual, firm, corporation, partnership, cooperative, association, or any other organization, legal entity, or group of individuals however organized, including entities related by common ownership or corporate control, that either makes algorithmic eligibility determinations or algorithmic information availability determinations, or relies on algorithmic eligibility determinations or algorithmic information availability determinations supplied by a service provider, and that meets one or more of the following criteria:

     (1)  Possesses or controls personal information on more than twenty-five thousand residents of the State;

     (2)  Has more than $15,000,000 in average annualized gross receipts for the three years preceding the most recent fiscal year;

     (3)  Is a data broker, or other entity, that derives fifty per cent or more of its annual revenue by collecting, assembling, selling, distributing, providing access to, or maintaining personal information, and some proportion of the personal information concerns a resident of the State who is not a customer or an employee of that entity; or

     (4)  Is a service provider.
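
The four criteria above amount to a simple threshold test that an organization can run against its own records. A minimal Python sketch follows; it is illustrative only, the EntityProfile fields and is_covered_entity helper are hypothetical, and the entity must additionally make or rely on the algorithmic determinations described above:

```python
# Illustrative sketch only; names are hypothetical, not defined by the chapter.
from dataclasses import dataclass

@dataclass
class EntityProfile:
    state_residents_with_data: int            # residents of the State whose personal information is possessed or controlled
    gross_receipts_prior_three_years: list    # annual gross receipts for the three preceding years
    revenue_share_from_personal_info: float   # fraction of annual revenue derived from personal information
    holds_noncustomer_resident_data: bool     # holds data on residents who are not customers or employees
    is_service_provider: bool

def is_covered_entity(e: EntityProfile) -> bool:
    """True if the entity meets any one of criteria (1) through (4).
    (The entity must also make, or rely on, the algorithmic determinations.)"""
    avg_receipts = sum(e.gross_receipts_prior_three_years) / len(e.gross_receipts_prior_three_years)
    return (
        e.state_residents_with_data > 25_000                    # criterion (1)
        or avg_receipts > 15_000_000                            # criterion (2)
        or (e.revenue_share_from_personal_info >= 0.50
            and e.holds_noncustomer_resident_data)              # criterion (3)
        or e.is_service_provider                                # criterion (4)
    )
```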

     "Important life opportunities" means access to, approval for, or offer of credit, insurance, education, employment, housing, or place of public accommodation as defined in section 489-2.

     "Personal information" means any information held by a covered entity, regardless of how the information is collected, inferred, derived, created, or obtained, that is linked or reasonably linkable to an individual, household, or personal device.  "Personal information" includes but is not limited to:

     (1)  Individually identifiable information such as a real name, alias, signature, date of birth, union membership number, postal address, unique personal identifier, online identifier, internet protocol address, media access control address, unique device identifier, email address, phone number, account name, social security number, military identification number, driver's license number, vehicle identification number, passport number, or other similar identifiers;

     (2)  A person's race, national origin, religious affiliation, gender identity, sexual orientation, marital status, or disability;

     (3)  Commercial information, including records of personal property; products or services purchased, obtained, or considered; or other purchasing or consuming histories or tendencies;

     (4)  Real-time or historical geolocation data more specific than a fifty-mile radius;

     (5)  Education records, as defined in title 34, Code of Federal Regulations section 99.3 or any successor regulation;

     (6)  Biometric data, including voice signatures, facial geometry, fingerprints, and retina or iris scans; and

     (7)  Inferences drawn from any of the information identified in paragraphs (1) through (6) to create a profile about an individual reflecting the individual's predispositions, behavior, habits, attitudes, intelligence, abilities, and aptitudes.

     "Reasonably linkable to an individual, household, or personal device" means personal information that can be used on its own or in combination with other information reasonably available to the covered entity, regardless of whether the other information is held by the covered entity, to identify an individual, household, or personal device.

     "Service provider" means any entity that performs algorithmic eligibility determinations or algorithmic information availability determinations on behalf of another entity.

     §   -2  Prohibited practices; exemptions.  (a)  A covered entity shall not make an algorithmic eligibility determination or an algorithmic information availability determination on the basis of an individual's or class of individuals' actual or perceived race, color, religion, national origin, sex, gender identity or expression, sexual orientation, familial status, source of income, or disability in a manner that segregates, discriminates against, or otherwise makes important life opportunities unavailable to an individual or class of individuals.

     (b)  Any practice that has the effect or consequence of violating subsection (a) shall be deemed to be an unlawful discriminatory practice.

     (c)  Nothing in subsection (a) shall prohibit covered entities from using individuals' personal information as part of an affirmative action plan adopted pursuant to state or federal law.

     §   -3  Relationships with service providers.  Any covered entity that relies in whole or in part on a service provider to conduct an algorithmic eligibility determination or an algorithmic information availability determination shall require by written agreement that the service provider implement and maintain measures reasonably designed to ensure that the service provider complies with this chapter.

     §   -4  Right to notice and disclosure.  (a)  A covered entity shall:

     (1)  Develop a notice that explains how the covered entity uses personal information in algorithmic eligibility determinations and algorithmic information availability determinations, including:

          (A)  What personal information the covered entity collects, generates, infers, uses, and retains;

          (B)  What sources the covered entity uses to collect, generate, or infer personal information;

          (C)  Whether the personal information is shared, sold, leased, or exchanged with any service providers for any kind of consideration, and if so, the names of those service providers, including subsidiaries of the service providers;

          (D)  A brief description of the relationship between the personal information and the algorithmic eligibility or algorithmic information availability determinations;

          (E)  How long the covered entity will hold the personal information; and

          (F)  The rights provided under this chapter;

     (2)  Ensure that the notice developed and made available under paragraph (1) of this subsection:

          (A)  Is clear, concise, and complete;

          (B)  Does not contain unrelated, confusing, or contradictory materials; and

          (C)  Is in a format that is:

              (i)  Prominent and easily accessible;

             (ii)  Capable of fitting on one printed page; and

            (iii)  Provided in English, as well as in any non-English language spoken by at least five hundred individuals in the State population;

     (3)  Within thirty days after changing its collection or use practices or policies in a way that affects the content of the notice required by paragraph (1) of this subsection, update that notice;

     (4)  Make the notice required under paragraph (1) of this subsection continuously and conspicuously available:

          (A)  On the covered entity's website or mobile application, if the covered entity maintains a website or mobile application; and

          (B)  At the physical place of business or any offline equivalent the covered entity maintains; and

     (5)  Send the notice required under paragraph (1) of this subsection to an individual before the first algorithmic information availability determination it makes about the individual by:

          (A)  Mail, if the personal information was gathered through the individual contacting or contracting with the covered entity through mail;

          (B)  Email, if the personal information was gathered through the individual contacting or contracting with the covered entity through email, or if the covered entity has the individual's email address for another reason;

          (C)  Informing individuals through a "pop-up" notification upon navigation to the covered entity's website or within the covered entity's mobile application; or

          (D)  Providing a clear and conspicuous link on the covered entity's website's homepage, or the home screen of its mobile application, leading to the notice.
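
The notice items in paragraph (1)(A) through (F) above map naturally onto a structured record that a covered entity could assemble and check for completeness before publication. A minimal Python sketch, with hypothetical keys mirroring those items:

```python
# Illustrative sketch only; the keys and helper are hypothetical.
REQUIRED_NOTICE_ITEMS = {
    "information_collected": None,           # (A) personal information collected, generated, inferred, used, retained
    "sources": None,                         # (B) sources used to collect, generate, or infer it
    "service_provider_sharing": None,        # (C) whether it is shared, sold, leased, or exchanged, and with whom
    "relationship_to_determinations": None,  # (D) how the information relates to the determinations
    "retention_period": None,                # (E) how long it is held
    "rights_under_chapter": None,            # (F) the rights provided under this chapter
}

def notice_is_complete(draft: dict) -> bool:
    """Check that a draft notice addresses every required item before publication."""
    return all(draft.get(key) for key in REQUIRED_NOTICE_ITEMS)
```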

     (b)  A covered entity need not provide the notice described under subsection (a) of this section if another covered entity has provided notice to the same individual for the same action as part of a contracted arrangement with the covered entity.

     (c)  A covered entity that is subject to subsection (a)(1), with respect to any individual whose personal information the covered entity holds as described in that subsection, shall not use any personal information of the individual in an algorithmic eligibility determination unless the covered entity has provided the individual with notice consistent with that subsection.

     (d)  If a covered entity takes any adverse action with respect to any individual that is based in whole or in part on the results of an algorithmic eligibility determination, the covered entity shall provide the individual a written or electronic disclosure that includes:

     (1)  The covered entity's name, address, email address, and telephone number;

     (2)  The factors the determination depended on; and

     (3)  An explanation that the individual may:

          (A)  Access any personal information pertaining to that individual that the covered entity used to make the determination;

          (B)  Submit corrections to that information; and

          (C)  If the individual submits corrections, request that the covered entity conduct a reasoned reevaluation of the relevant algorithmic eligibility determination, conducted by a human, based on the corrected data.

     §   -5  Auditing for discriminatory processing and reporting requirement.  (a)  A covered entity shall annually audit its algorithmic eligibility determination and algorithmic information availability determination practices to:

     (1)  Determine whether the processing practices discriminate in a manner prohibited under section    -2;

     (2)  Analyze disparate-impact risks of algorithmic eligibility determinations and algorithmic information availability determinations based on actual or perceived race, color, religion, national origin, sex, gender identity or expression, sexual orientation, familial status, genetic information, source of income, or disability;

     (3)  Create and retain for at least five years an audit trail that records, for each algorithmic eligibility determination:

          (A)  The type of algorithmic eligibility determination made;

          (B)  The data used in the determination, including the source of the data;

          (C)  The methodology used by the entity to establish the algorithm;

          (D)  The algorithm used to make the determination;

          (E)  Any data or sets of data used to train the algorithm;

          (F)  Any testing and results for model performance across different subgroups or for discriminatory effects;

          (G)  The methodology used to render the determination; and

          (H)  The ultimate decision rendered;

     (4)  Conduct annual impact assessments of:

          (A)  Existing systems that render algorithmic eligibility determinations and algorithmic information availability determinations; and

          (B)  Prior to implementation, new systems that render algorithmic eligibility determinations and algorithmic information availability determinations;

     (5)  Conduct the audits under paragraphs (1), (2), and (3) of this subsection in consultation with third parties who have substantial information about or participated in the covered entity's algorithmic eligibility determinations and algorithmic information availability determinations, including service providers; and

     (6)  Identify and implement reasonable measures to address risks of an unlawful disparate impact identified in the audits and impact assessments conducted under paragraphs (1), (2), and (3) of this subsection, including the risks posed by determinations made by the covered entity's service providers.
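
Paragraphs (2) and (3) above describe, respectively, a disparate-impact risk analysis and a per-determination audit trail. A minimal Python sketch of how a covered entity might structure the retained records and screen them by subgroup; the record fields mirror items (A) through (H), and the 0.8 selection-rate ratio noted in the code is the conventional "four-fifths" screen, not a threshold the chapter prescribes:

```python
# Illustrative sketch only; field names are hypothetical. The 0.8 ratio noted
# below is the conventional "four-fifths" screen, not a statutory threshold.
from dataclasses import dataclass
from collections import defaultdict

@dataclass
class AuditTrailRecord:
    determination_type: str      # (A) type of algorithmic eligibility determination
    input_data: dict             # (B) data used in the determination, with its source
    build_methodology: str       # (C) methodology used to establish the algorithm
    algorithm_id: str            # (D) the algorithm used to make the determination
    training_data_refs: list     # (E) data or data sets used to train the algorithm
    subgroup_testing: dict       # (F) testing and results across subgroups or for discriminatory effects
    decision_methodology: str    # (G) methodology used to render the determination
    outcome: str                 # (H) the ultimate decision rendered, e.g. "approved" or "denied"

def selection_rate_ratios(records: list, group_key: str) -> dict:
    """Compare each subgroup's approval rate to the highest subgroup's rate."""
    totals, approvals = defaultdict(int), defaultdict(int)
    for r in records:
        group = r.input_data.get(group_key, "unknown")
        totals[group] += 1
        approvals[group] += (r.outcome == "approved")
    rates = {g: approvals[g] / totals[g] for g in totals}
    top = max(rates.values(), default=0.0)
    # Ratios well below 0.8 for any protected subgroup would call for the
    # "reasonable measures" described in paragraph (6).
    return {g: (rate / top if top else 0.0) for g, rate in rates.items()}
```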

     (b)  A covered entity shall annually submit a report containing the results of the audit mandated under this section to the department of the attorney general on a form provided by the department of the attorney general.  The report shall contain the following information:

     (1)  The types of algorithmic eligibility determinations and algorithmic information availability determinations that the covered entity makes;

     (2)  The data and methodologies that the covered entity uses to establish the algorithms;

     (3)  The optimization criteria of the algorithms used to make the determinations;

     (4)  Any data or sets of data used to train the algorithms, and the source or sources of the data;

     (5)  The methodologies the covered entity uses to render the determinations;

     (6)  Any performance metrics the entity uses to gauge the accuracy of the assessments, including accuracy, confidence intervals, and how those assessments are obtained;

     (7)  The frequency, methodology, and results of the impact assessments or risk assessments that the entity has conducted;

     (8)  Within the description of each of the decisions in paragraphs (1) through (7), the rationale for each decision;

     (9)  Whether the covered entity has received complaints from individuals regarding the algorithmic eligibility determinations and algorithmic information availability determinations it has made; and

    (10)  If the covered entity has determined that one or more of the exemptions referred to in section    -2(c) apply to practices that would otherwise violate section    -2(a), a declaration and explanation of the covered entity's reliance on those exemptions.
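
Item (6) above asks for accuracy figures together with confidence intervals. A minimal Python sketch of one standard way to produce them (a normal-approximation interval; nothing in the chapter prescribes this particular method):

```python
# Illustrative sketch only; the normal-approximation (Wald) interval is one common choice.
import math

def accuracy_with_ci(correct: int, total: int, z: float = 1.96):
    """Return (accuracy, lower, upper) for an approximate 95% confidence interval."""
    p = correct / total
    half_width = z * math.sqrt(p * (1 - p) / total)
    return p, max(0.0, p - half_width), min(1.0, p + half_width)

# Example: 9,200 correct determinations out of 10,000 audited
# -> accuracy 0.92 with an approximate 95% interval of (0.9147, 0.9253).
print(accuracy_with_ci(9_200, 10_000))
```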

     (c)  To the extent consistent with federal law or state law, a covered entity may, in place of the report required by subsection (b), submit to the department of the attorney general a report previously submitted to a federal, state, or other government entity, if that report contains the required information or is supplemented with missing information.

     (d)  The attorney general may adopt rules pursuant to chapter 91 necessary to implement the reporting provisions of this section.

     §   -6  Enforcement; penalties.  (a)  In any case in which the attorney general has reason to believe that any person has used, is using, or intends to use any method, act, or practice in violation of this chapter or rule adopted under this chapter, or has failed to provide a notice, a disclosure, or a report required by this chapter, the attorney general may commence appropriate civil action for:

     (1)  A temporary or permanent injunction;

     (2)  Penalties as described in subsection (c) of this section;

     (3)  Damages or restitution; or

     (4)  Any other relief that the court considers appropriate.

     (b)  In the course of an investigation to determine whether to seek relief, the attorney general may subpoena witnesses; administer oaths; examine an individual under oath; require sworn written responses to written questions; and compel production of records, books, papers, contracts, and other documents.

     (c)  Any covered entity or service provider that violates this chapter shall be liable for a civil penalty of not more than $10,000 for each violation, which may be recovered in a civil action brought by the attorney general.

     (d)  Any civil penalty assessed for a violation of this chapter, and the proceeds of any settlement of an action brought pursuant to this section, shall be deposited in the litigation deposits trust account under section 28-16.

     (e)  Any person aggrieved by a violation of this chapter may bring a civil action in any court of competent jurisdiction, and the court may award an amount not less than $100 and not greater than $10,000 per violation or actual damages, whichever is greater.

     (f)  In a civil action brought under either subsection (c) or (e) of this section in which the plaintiff prevails, the court may also award:

     (1)  Punitive damages;

     (2)  Reasonable attorney's fees and litigation costs; and

     (3)  Any other relief, including equitable or declaratory relief, that the court determines appropriate.

     (g)  In a civil action brought under subsection (e) of this section, a violation of this chapter or a rule adopted under this chapter with respect to an individual constitutes a concrete and particularized injury to that individual."

     SECTION 2.  This Act shall take effect upon its approval.

INTRODUCED BY:

_____________________________

Report Title:

Department of the Attorney General; Algorithmic Discrimination; Artificial Intelligence

Description:

Prohibits users of algorithmic decision-making from utilizing algorithmic eligibility determinations in a discriminatory manner.  Requires users of algorithmic decision-making to send corresponding notices to individuals whose personal information is used.  Requires users of algorithmic decision-making to submit annual reports to the Department of the Attorney General.  Provides for appropriate means of civil enforcement.