Pa. Legislature passes bill criminalizing malicious use of AI-generated porn

Zack Hoopes / pennlive.com

HARRISBURG — A bill that would make it a crime to disseminate AI-generated “deepfake” pornography of nonconsenting people has passed both chambers of the Pennsylvania Legislature, sending it to Gov. Shapiro’s desk.

The bill also adds a definition of “child sexual abuse material” to the state criminal code, correcting a defect in an earlier bill that replaced “child pornography” with the more up-to-date term.

The bill, which first cleared the Senate over the summer, was amended before receiving a final vote in the House and a concurrence vote in the Senate on Wednesday.

The concept of the bill remains the same, however. The bill would add the term “artificially generated sexual depiction” to the statute regarding the malicious creation and dissemination of explicit images, and add a similar definition of “artificially generated child sexual abuse material” to the statute on child abuse.

Such materials are defined as any image that uses artificial intelligence or computer generation and “appears to authentically depict an individual in a state of nudity or engaged in sexual conduct that did not occur in reality.”

The bill is necessary, legislators said, to combat “deepfake” pornographic images used to harass a person by appearing to show them in a sexual situation that did not actually happen.

Sen. Jimmy Dillon, D-Philadelphia, described “a growing and deeply concerning issue” at schools, noting the recent scandal in Westfield, N.J., where AI-generated pornography appearing to depict students as young as 14 was widely disseminated in a high school.

Last year, attorneys general from all 50 states and several U.S. territories issued a letter to Congress saying that additional federal legislation was needed to fully prosecute those transmitting AI-generated child sex material.

A report late last year from Stanford University’s Cyber Policy Center found what researchers said were hundreds of images of child sexual abuse included in a vast dataset used to train AI models, increasing the likelihood that AI programs will become increasingly adept at replicating child pornography.