Berman Introduces Legislation to Outlaw Child Sexual Abuse Images Generated by AI

For immediate release:

SACRAMENTO – Today Assemblymember Marc Berman (D-Menlo Park) announced AB 1831, which will criminalize the creation, distribution, and possession of artificial intelligence-generated Child Sexual Abuse Material (CSAM). The bill aims to address the emerging challenges posed by artificial intelligence (AI) technologies that can produce disturbing and harmful content resembling actual children.

“The sexual exploitation of children must be illegal, full stop. It should not matter that the images were generated by AI, which is being used to create child sexual abuse material (CSAM) that is virtually indistinguishable from a real child,” said Assemblymember Marc Berman. “We must stop the exploitation of children. It is critical that our laws keep up with rapidly evolving AI technology to ensure predators are being prosecuted and children are being protected.”

“Safeguarding our children from potential abusive or exploitative practices is imperative, and as new technologies present new challenges, we must do everything we can to ensure their safety in an ever-changing world,” said SAG-AFTRA Los Angeles Local President Jodi Long. “We are deeply concerned by the threat of computer-generated and artificial intelligence-generated child sexual abuse material (CSAM), and this legislation is an important step to preventing these dangerous practices.”

Ventura County District Attorney Erik Nasarenko, a cosponsor of the bill, stated, “As technology evolves, so must our laws. This bill sends a clear message that our society will not tolerate the malicious use of artificial intelligence to produce harmful sexual content involving minors.”

“Confronting an unprecedented epidemic of social media-facilitated teen suicides, we urgently need to ensure that our criminal statutes against abusing or endangering children clearly reach the sickening and public sexualization of their appearances,” said Ed Howard, Senior Counsel, Children’s Advocacy Institute, University of San Diego School of Law.

“Common Sense Media applauds Assemblymember Berman’s efforts to safeguard children online with the introduction of this bill, AB 1831, which builds on the success of AB 1394, a bill that was signed into law last year to help stamp out the deeply harmful problem of online child sex trafficking. This new bill employs a similarly proactive approach, this time protecting kids and teens against online exploitation that is exacerbated by the rise of AI. California should take the lead when it comes to protecting kids and families from the negative impacts of this powerful new technology,” said James P. Steyer, founder and CEO, Common Sense Media.

Existing law prohibits the manufacture, distribution, and possession of CSAM depicting an actual child. It does not address computer-generated images that depict the likeness of a child, even those that are virtually indistinguishable from an actual child, or those that are computer-generated to look like a known child or child celebrity. AB 1831 would prohibit the creation and possession of obscene CSAM images generated by artificial intelligence.

CSAM is a visual depiction of sexually explicit conduct involving a minor. Research has shown a correlation between the consumption of CSAM and an increased risk of individuals engaging in hands-on sexual offenses against minors.[1] Viewing and possessing such material can contribute to desensitization and may escalate criminal behavior. Additionally, a recent study revealed that machine-learning models were trained on datasets containing thousands of depictions of known CSAM victims, revictimizing those children by using their likenesses to generate AI CSAM images in perpetuity.[2]

[1] Rivera, A. A. (2019). Child Pornography and Child Sexual Abuse in the 21st Century: A Critical Overview of the Technological Shifts in Distribution, the Academic Literature, and the Current Re-Entry Policies for Convicts of these Crimes (Master's thesis). Rochester Institute of Technology; Bourke, M. L., & Hernandez, A. E. (2009). Butner Study Redux: A Report of the Incidence of Hands-on Child Victimization by Child Pornography Offenders. Journal of Family Violence, 24(3), 183-191.

[2] Thiel, D. (2023). Identifying and Eliminating CSAM in Generative ML Training Data and Models. Stanford Internet Observatory.