Bill Text: CA AB1394 | 2023-2024 | Regular Session | Chaptered


Bill Title: Commercial sexual exploitation: child sexual abuse material: civil actions.

Spectrum: Moderate Partisan Bill (Democrat 4-1)

Status: (Passed) 2023-10-08 - Chaptered by Secretary of State - Chapter 579, Statutes of 2023.


Assembly Bill No. 1394
CHAPTER 579

An act to amend Section 3345.1 of, and to add Title 22 (commencing with Section 3273.65) to Part 4 of Division 3 of, the Civil Code, relating to social media platforms.

[ Approved by Governor October 08, 2023. Filed with Secretary of State October 08, 2023. ]

LEGISLATIVE COUNSEL'S DIGEST


AB 1394, Wicks. Commercial sexual exploitation: child sexual abuse material: civil actions.
Existing law, in a civil action brought by, on behalf of, or for the benefit of a minor, or nonminor dependent, against a person who engaged in an act of commercial sexual exploitation, as defined, of a minor or nonminor dependent, authorizes the trier of fact to either impose a fine, civil penalty, or other penalty, or other remedy in an amount up to 3 times greater than authorized by the statute or to award a civil penalty not exceeding $50,000 and not less than $10,000 for each act of commercial sexual exploitation committed by the defendant if the trier of fact makes an affirmative finding regarding certain factors, including whether the defendant’s conduct was directed to more than one minor or nonminor dependent, as prescribed.
This bill would, beginning January 1, 2025, prohibit a social media platform, as defined, from knowingly facilitating, aiding, or abetting commercial sexual exploitation, as specified. The bill would require a court to award statutory damages not exceeding $4,000,000 and not less than $1,000,000 for each act of commercial sexual exploitation facilitated, aided, or abetted by the social media platform. The bill would define “facilitate, aid, or abet” to mean to deploy a system, design, feature, or affordance that is a substantial factor in causing minor users to be victims of commercial sexual exploitation. The bill would prohibit a social media platform from being deemed to be in violation of this provision if it demonstrates certain mitigating facts, including that the social media platform instituted and maintained a program of at least biannual audits of its designs, algorithms, practices, affordances, and features to detect designs, algorithms, practices, affordances, or features that have the potential to cause or contribute to violations of that provision, as prescribed.
Existing law, the California Consumer Privacy Act of 2018, grants to a consumer various rights with respect to personal information, as defined, that is collected by a business, as defined, including the right to request that a business delete personal information about the consumer that the business has collected from the consumer.
This bill would, beginning January 1, 2025, require a social media platform to, among other things, provide, in a mechanism that is reasonably accessible to users, a means for a user who is a California resident to report material to the social media platform that the user reasonably believes meets certain criteria, including that the reported material is child sexual abuse material, as defined, in which the reporting user is an identifiable minor depicted in the reported material.
This bill would, beginning January 1, 2025, require the social media platform to permanently block the instance of reported material and make reasonable efforts to remove and block other instances of the same reported material from being viewable on the social media platform, as prescribed, if there is a reasonable basis to believe that the reported material is child sexual abuse material that is displayed, stored, or hosted on the social media platform, and the report contains basic identifying information, as specified, sufficient to permit the social media platform to locate the reported material. The bill would make a violator of those provisions liable to the reporting user, including for statutory damages of no more than $250,000 per violation, $125,000 per violation, or $75,000 per violation, as specified.
Vote: MAJORITY   Appropriation: NO   Fiscal Committee: NO   Local Program: NO  

The people of the State of California do enact as follows:


SECTION 1.

 The Legislature finds and declares all of the following:
(a) Social media platforms frequently facilitate the sexual abuse, exploitation, and trafficking of children. Social media platforms are aware of this issue and have not taken steps sufficient to stop the problem.
(b) A Facebook whistleblower has, in a sworn statement submitted to the United States Securities and Exchange Commission, stated all of the following:
(1) Facebook’s efforts to prevent sexual abuse of children are inadequate and underresourced.
(2) Facebook does not track the full scale of child sexual abuse on its platforms, and executives refuse to spend funds available in order to prioritize “return on investment.”
(3) Facebook’s moderators are not sufficiently trained and are ill-prepared to prevent child sexual abuse.
(4) Facebook’s Groups, which are described as relying on self-policing, are facilitating harm and child sexual abuse because the product design allows sexual predators to use code words to describe the type of child and the type of sexual activity. The predators use Facebook’s encrypted Messenger service or WhatsApp to share these codes, which change routinely.
(c) A Forbes review of hundreds of recent TikTok livestreams revealed how viewers regularly use the comments to urge young girls to perform acts that appear to toe the line of child pornography, rewarding those who oblige with TikTok gifts that can be redeemed for money, or off-platform payments to Venmo, PayPal, or Cash App accounts that users list in their TikTok profiles. According to an assistant dean at Harvard Law School and faculty associate at Harvard’s Berkman Klein Center for Internet & Society, “[i]t’s the digital equivalent of going down the street to a strip club filled with 15-year-olds.”

SEC. 2.

 Title 22 (commencing with Section 3273.65) is added to Part 4 of Division 3 of the Civil Code, to read:

TITLE 22. Child Sexual Abuse Material Hosted on a Social Media Platform

3273.65.
 As used in this title:
(a) “Child pornography” has the same meaning as defined in Section 2256 of Title 18 of the United States Code, as amended from time to time.
(b) “Child sexual abuse material” means either of the following:
(1) Child pornography.
(2) Obscene matter that depicts a minor personally engaging in, or personally simulating, sexual conduct.
(c) “Identifiable minor” has the same meaning as defined in Section 2256 of Title 18 of the United States Code, as amended from time to time.
(d) “Minor” has the same meaning as defined in Section 2256 of Title 18 of the United States Code, as amended from time to time.
(e) “Obscene matter” has the same meaning as defined in Section 311 of the Penal Code.
(f) “Reporting user” means a natural person who reports material to a social media platform using the means provided by the social media platform pursuant to Section 3273.66.
(g) (1) “Social media company” has, except as provided in paragraph (2), the same meaning as defined in Section 22675 of the Business and Professions Code.
(2) “Social media company” does not include a nonprofit organization exempt from federal income tax pursuant to Section 501(c)(3) of the Internal Revenue Code.
(h) (1) “Social media platform” has, except as provided in paragraph (2), the same meaning as defined in Section 22675 of the Business and Professions Code.
(2) “Social media platform” does not include either of the following:
(A) A stand-alone direct messaging service that provides end-to-end encrypted communication or the portion of a multiservice platform that uses end-to-end encrypted communication.
(B) An internet-based service or application owned or operated by a nonprofit organization exempt from federal income tax pursuant to Section 501(c)(3) of the Internal Revenue Code.

3273.66.
 A social media platform shall do all of the following:
(a) Provide, in a mechanism that is reasonably accessible to users, a means for a user who is a California resident to report material to the social media platform that the user reasonably believes meets all of the following criteria:
(1) The reported material is child sexual abuse material.
(2) The reporting user is an identifiable minor depicted in the reported material.
(3) The reported material is displayed, stored, or hosted on the social media platform.
(b) Collect information reasonably sufficient to enable the social media platform to contact, pursuant to subdivision (c), a reporting user.
(c) A social media platform shall contact a reporting user in writing by a method, including, but not limited to, a telephone number for purposes of sending text messages, or an email address, that meets both of the following criteria:
(1) The method is chosen by the reporting user.
(2) The method is not a method that is within the control of the social media company that owns or operates the social media platform.
(d) (1) Permanently block the instance of reported material from being viewable on the social media platform if the reported material meets all of the following criteria:
(A) There is a reasonable basis to believe that the reported material is child sexual abuse material.
(B) The reported material is displayed, stored, or hosted on the social media platform.
(C) (i) The report contains basic identifying information, such as an account identifier, sufficient to permit the social media platform to locate the reported material.
(ii) A social media platform shall not require a report to contain a specific piece of information for purposes of this subparagraph.
(2) A social media platform shall make reasonable efforts to remove and block other instances of the same reported material blocked pursuant to this subdivision from being viewable on the social media platform.
(e) Provide written confirmation to a reporting user that the social media platform received that person’s report that meets all of the following criteria:
(1) The written confirmation is provided to the reporting user within 36 hours of when the material was first reported.
(2) The written confirmation is provided using the information collected from the reporting user under subdivision (b).
(3) The written confirmation informs the reporting user of the schedule of regular written updates that the social media platform is required to make under subdivision (f).
(f) (1) Provide a written update to the reporting user as to the status of the social media platform’s handling of the reported material using the information collected from the reporting user under subdivision (b).
(2) The written update required by this subdivision shall be provided seven days after the date on which the written confirmation required under subdivision (e) was provided and every seven days thereafter until the final written determination required by subdivision (g) is provided.
(g) Issue a final written determination to the reporting user, using the information collected from the reporting user under subdivision (b), stating one of the following:
(1) The reported material has been determined to be child sexual abuse material that was displayed, stored, or hosted on the social media platform and has been blocked on the social media platform.
(2) The reported material has been determined not to be child sexual abuse material.
(3) The reported material has been determined not to be displayed, stored, or hosted on the social media platform.
(h) (1) Except as provided in paragraph (2), comply with subdivisions (c) to (g), inclusive, no later than 30 days after the date on which material was first reported pursuant to this section.
(2) (A) If the social media platform cannot comply with subdivisions (c) to (g), inclusive, due to circumstances beyond the reasonable control of the social media platform, the social media platform shall comply with subdivisions (c) to (g), inclusive, no later than 60 days after the date on which material was first reported pursuant to this section.
(B) If this paragraph applies, the social media platform shall promptly provide written notice of the delay, no later than 48 hours from the time the social media platform knew the delay was likely to occur, to the reporting user using the information collected from the reporting user under subdivision (b).
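The deadlines in subdivisions (c) through (h) above form a fixed schedule: confirmation within 36 hours of the report, status updates every seven days after confirmation, and full compliance within 30 days (or 60 days when the delay is beyond the platform's reasonable control). As an editor's illustrative sketch only, not part of the statute, that schedule can be computed directly; the function name and the choice of Python are assumptions:

```python
from datetime import datetime, timedelta

def compliance_schedule(reported_at: datetime, delayed: bool = False) -> dict:
    """Deadline schedule implied by Civil Code section 3273.66(c)-(h)."""
    # (e)(1): written confirmation within 36 hours of the report.
    confirmation_due = reported_at + timedelta(hours=36)
    # (h): full compliance within 30 days, or 60 days when circumstances
    # beyond the platform's reasonable control apply ((h)(2)(A)).
    final_due = reported_at + timedelta(days=60 if delayed else 30)
    # (f)(2): a written update 7 days after confirmation and every 7 days
    # thereafter until the final written determination is provided.
    updates = []
    due = confirmation_due + timedelta(days=7)
    while due < final_due:
        updates.append(due)
        due += timedelta(days=7)
    return {
        "confirmation_due": confirmation_due,
        "update_schedule": updates,
        "final_determination_due": final_due,
    }

sched = compliance_schedule(datetime(2025, 1, 1, 9, 0))
print(sched["confirmation_due"])         # 2025-01-02 21:00:00
print(len(sched["update_schedule"]))     # 4
print(sched["final_determination_due"])  # 2025-01-31 09:00:00
```

Note that under (h)(2)(B) a delayed platform must also send written notice of the delay within 48 hours of knowing the delay was likely; that notice obligation is not modeled above.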

3273.67.
 (a) A social media company that fails to comply with the requirements of this title shall be liable to a reporting user for all of the following:
(1) Any actual damages sustained by the reporting user as a result of the violation.
(2) (A) (i) Subject to clauses (ii) and (iii), statutory damages of no more than two hundred fifty thousand dollars ($250,000) per violation.
(ii) If a social media platform has permanently blocked the instance of the reported material pursuant to subdivision (d) of Section 3273.66 before a complaint is filed for a violation of this title, the maximum statutory damages awarded pursuant to clause (i) shall be one hundred twenty-five thousand dollars ($125,000) per violation.
(iii) If a social media platform meets all of the following requirements, the maximum statutory damages awarded pursuant to clause (i) for a violation of subdivisions (d) to (g), inclusive, of Section 3273.66 shall be seventy-five thousand dollars ($75,000) per violation:
(I) The social media platform registers with, and participates in, the National Center for Missing and Exploited Children’s Take It Down service or its successor.
(II) The social media platform receives updated hash values for identified child sexual abuse material from the National Center for Missing and Exploited Children’s Take It Down service, or its successor, at least once every 36 hours.
(III) Within 36 hours of receiving updated hash values for identified child sexual abuse material from the National Center for Missing and Exploited Children’s Take It Down service, or its successor, pursuant to subclause (II), the social media platform removes child sexual abuse material identified by hash values from the social media platform.
(IV) The social media platform reports identified child sexual abuse material to the National Center for Missing and Exploited Children’s CyberTipline, as required by Section 2258A of Title 18 of the United States Code.
(V) The social media platform provides to a reporting user both of the following when a user reports child sexual abuse material to the platform directly:
(ia) Written confirmation to the reporting user that the social media platform received that person’s report within 36 hours after the child sexual abuse material was reported.
(ib) A final written determination to the reporting user within 30 days after the date on which the material was first reported.
(B) In determining the amount of statutory damages pursuant to this paragraph, a court shall consider the willfulness and severity of the violation and whether the social media platform has previously violated this title.
(3) Costs of the action, together with reasonable attorney’s fees, as determined by the court.
(4) Any other relief that the court deems proper.
(b) The failure of a social media platform to comply with subdivisions (c) to (g), inclusive, of Section 3273.66 within 60 days after the date on which material was first reported pursuant to Section 3273.66 shall establish a rebuttable presumption that the reporting user is entitled to statutory damages under this section.
(c) This title shall not be construed to limit or impair in any way a cause of action under paragraph (1) of Section 1710.
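Section 3273.67 sets three per-violation caps on statutory damages: a $250,000 baseline, reduced to $125,000 when the platform permanently blocked the reported instance before the complaint was filed, and to $75,000 (for violations of section 3273.66(d) through (g)) when the platform meets all of the Take It Down program requirements in clause (iii). As an editor's illustrative sketch only, not part of the statute, the tiering reduces to simple arithmetic; the function name and inputs are assumptions:

```python
def max_statutory_damages(blocked_before_complaint: bool,
                          meets_take_it_down_program: bool) -> int:
    """Per-violation statutory damages cap under section 3273.67(a)(2)(A).

    Caps only; within the cap, a court weighs willfulness, severity, and
    prior violations under subparagraph (B). The $75,000 tier applies only
    to violations of section 3273.66(d)-(g).
    """
    if meets_take_it_down_program:
        return 75_000   # clause (iii): all subclauses (I)-(V) satisfied
    if blocked_before_complaint:
        return 125_000  # clause (ii): instance blocked pre-complaint
    return 250_000      # clause (i): baseline cap

print(max_statutory_damages(False, False))  # 250000
print(max_statutory_damages(True, False))   # 125000
print(max_statutory_damages(True, True))    # 75000
```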

3273.68.
 A waiver of the provisions of this title is contrary to public policy and is void and unenforceable.

3273.69.
 The provisions of this title are severable. If any provision of this title or its application is held invalid, that invalidity shall not affect other provisions or applications that can be given effect without the invalid provision or application.

SEC. 3.

 Section 3345.1 of the Civil Code is amended to read:

3345.1.
(a) This section shall apply only in a civil action brought by, or on behalf of, or for the benefit of, a person who is a minor or nonminor dependent and is a victim of commercial sexual exploitation committed by a person who is over 18 years of age or facilitated, aided, or abetted by a social media platform in violation of subdivision (g). For purposes of this section, the age of the victim, the status of the victim as a minor or nonminor dependent, and the age of the defendant are determined at the time of the defendant’s act of commercial sexual exploitation of the victim.
(b) In a civil action brought by, on behalf of, or for the benefit of a minor, or nonminor dependent, against a person who engaged in any act of commercial sexual exploitation of a minor or nonminor dependent, whenever a trier of fact is authorized by a statute, other than subdivision (c), to impose either a fine, or a civil penalty or other penalty, or any other remedy the purpose or effect of which is to punish or deter, and the amount of the fine, penalty, or other remedy is subject to the trier of fact’s discretion, the trier of fact shall consider all of the following factors, in addition to other appropriate factors, in determining the amount of fine, civil penalty, or other penalty, or other remedy to impose. If the trier of fact makes an affirmative finding in regard to one or more of the following factors, it may impose a fine, civil penalty, or other penalty, or other remedy in an amount up to three times greater than authorized by the statute, or, if the statute does not authorize a specific amount, up to three times greater than the amount the trier of fact would impose in the absence of that affirmative finding:
(1) Whether the defendant’s conduct was directed to more than one minor or nonminor dependent.
(2) Whether one or more minors or nonminor dependents suffered substantial physical, emotional, or economic damage resulting from the defendant’s conduct.
(3) Whether the defendant knew or reasonably should have known that the victim was a minor or nonminor dependent. It shall not be a defense to imposition of fines, penalties, or other remedies pursuant to this paragraph that the defendant was unaware of the victim’s age or status as a nonminor dependent at the time of the act.
(c) If the trier of fact is not authorized by statute to impose a civil penalty in an action described in subdivision (b), the court may award a civil penalty not exceeding fifty thousand dollars ($50,000), and not less than ten thousand dollars ($10,000), for each act of commercial sexual exploitation committed by the defendant upon making an affirmative finding in regard to one or more of the factors set forth in paragraphs (1) to (3), inclusive, of subdivision (b). This penalty may be imposed in addition to any other remedy available in law or in equity.
(d) Any penalty imposed pursuant to this section shall be paid to the victim of the act of sexual exploitation.
(e) It shall not be a defense to the imposition of fines or penalties pursuant to this section that the victim consented to the act of commercial sexual exploitation.
(f) If the victim is under 18 years of age, the court, in its discretion, may order that any penalty imposed pursuant to this section be held in trust for the victim and used exclusively for the benefit and well-being of the victim. When the victim reaches 18 years of age or is emancipated, the trust shall expire and any unspent remainder shall be the sole property of the victim.
(g) (1) A social media platform shall not knowingly facilitate, aid, or abet commercial sexual exploitation.
(2) For a violation of this subdivision, a court shall award statutory damages not exceeding four million dollars ($4,000,000) and not less than one million dollars ($1,000,000) for each act of commercial sexual exploitation facilitated, aided, or abetted by the social media platform.
(3) A social media platform shall not be deemed to be in violation of this subdivision if it demonstrates all of the following:
(A) The social media platform instituted and maintained a program of at least biannual audits of its designs, algorithms, practices, affordances, and features to detect designs, algorithms, practices, affordances, or features that have the potential to cause or contribute to violations of this subdivision.
(B) The social media platform took action, within 30 days of the completion of an audit described in subparagraph (A), designed to mitigate or eliminate the reasonably foreseeable risk that a design, algorithm, practice, affordance, or feature violates, or contributes to a violation of, this subdivision.
(C) The social media platform provided to each member of the social media platform’s board of directors a true and correct copy of each audit within 90 days of the audit being completed accompanied by a description of any action taken pursuant to subparagraph (B).
(4) Without in any way limiting the application of the term “knowingly” under paragraph (1), for purposes of this subdivision, a social media platform shall be deemed to have knowledge under paragraph (1) if all of the following are true:
(A) Material was reported to a social media platform using the mechanism required under subdivision (a) of Section 3273.66 for four consecutive months.
(B) The criteria set forth in paragraphs (1) to (3), inclusive, of subdivision (a) of Section 3273.66 are established with respect to that reported material.
(C) The reported material was first displayed, stored, or hosted on the platform after January 1, 2025.
(5) As used in this subdivision, “facilitate, aid, or abet” means to deploy a system, design, feature, or affordance that is a substantial factor in causing minor users to be victims of commercial sexual exploitation.
(h) As used in this section:
(1) “Commercial sexual exploitation” means an act committed for the purpose of obtaining property, money, or anything else of value in exchange for, or as a result of, a sexual act of a minor or nonminor dependent, including, but not limited to, an act that would constitute a violation of any of the following:
(A) Sex trafficking of a minor in violation of subdivision (c) of Section 236.1 of the Penal Code.
(B) Pimping of a minor in violation of Section 266h of the Penal Code.
(C) Pandering of a minor in violation of subdivision (b) of Section 266i of the Penal Code.
(D) Procurement of a child under 16 years of age for lewd and lascivious acts in violation of Section 266j of the Penal Code.
(E) Solicitation of a child for a purpose that is either in violation of subparagraph (A) or pursuant to paragraph (3) of subdivision (b) of Section 647 of the Penal Code.
(F) An act of sexual exploitation described in subdivision (c) or (d) of Section 11165.1 of the Penal Code.
(2) “Nonminor dependent” has the same meaning as in subdivision (v) of Section 11400 of the Welfare and Institutions Code.
(3) (A) “Social media platform” has, except as provided in subparagraph (B), the same meaning as defined in Section 22675 of the Business and Professions Code.
(B) “Social media platform” does not include either of the following:
(i) A stand-alone direct messaging service that provides end-to-end encrypted communication or the portion of a multiservice platform that uses end-to-end encrypted communication.
(ii) An internet-based service or application owned or operated by a nonprofit organization exempt from federal income tax pursuant to Section 501(c)(3) of the Internal Revenue Code.
(i) A waiver of the provisions of this section is contrary to public policy and is void and unenforceable.
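Section 3345.1 bounds awards to fixed per-act ranges: subdivision (c) allows a civil penalty of $10,000 to $50,000 per act, and subdivision (g)(2) requires statutory damages of $1,000,000 to $4,000,000 per act facilitated, aided, or abetted by a social media platform. As an editor's illustrative sketch only, not part of the statute, clamping a proposed per-act figure to either range looks like this; the function name and example figures are assumptions:

```python
def clamp_award(requested: int, floor: int, ceiling: int) -> int:
    """Clamp a proposed per-act award to a statutory floor and ceiling."""
    return max(floor, min(ceiling, requested))

# 3345.1(g)(2): $1,000,000-$4,000,000 per act facilitated, aided, or
# abetted by a social media platform.
print(clamp_award(5_000_000, 1_000_000, 4_000_000))  # 4000000

# 3345.1(c): $10,000-$50,000 per act when the trier of fact is not
# otherwise authorized by statute to impose a civil penalty.
print(clamp_award(2_000, 10_000, 50_000))  # 10000
```

The subdivision (b) enhancement (up to three times the otherwise-authorized amount on an affirmative finding) would apply on top of the base statutory amount, at the trier of fact's discretion; it is not modeled here.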

SEC. 4.

 The provisions of this act are severable. If any provision of this act or its application is held invalid, that invalidity shall not affect other provisions or applications that can be given effect without the invalid provision or application.

SEC. 5.

 This act shall become operative on January 1, 2025.