Last year, WIRED reported that deepfake porn is on the rise; researchers estimate that 90 percent of deepfake videos are pornographic, and the vast majority of them involve women without their consent. But despite how pervasive the problem is, Cary Williams, a Columbia University researcher who has tracked legislation around non-consensual deepfakes, says she’s seeing lawmakers pay more attention to political deepfakes.
“Many states are more interested in protecting the integrity of their elections in that way than they are in addressing the issue of sexualized imagery,” she says.
Matthew Bierlein, a Republican representative from Michigan and co-sponsor of his state’s non-consensual deepfakes bill, says he got started on the issue after researching political deepfake laws: “Our plan was to make it a campaign finance violation if [political deepfakes] didn’t include a disclaimer to inform the public.” Bierlein says his work on political deepfakes led him to work with Democratic Representative Penelope Czernoglu, who spearheaded the non-consensual deepfakes bill.
Back in January, non-consensual deepfakes of Taylor Swift made headlines. “We saw an opportunity to do something,” Bierlein said. And because Michigan, unlike its neighbors and most other states, has a full-time legislature with well-paid officials, Bierlein said he felt he was in a position to be a regional leader in the Midwest. “We understand this isn’t just a Michigan issue, but there’s a lot we can start doing at the state level,” he said. “If we get this going, maybe Ohio will adopt this in their legislature, or Indiana or Illinois will adopt something similar and it will be easier to enforce.”
But the penalties for creating or sharing deepfakes without consent, and who is protected, can vary widely from state to state. “The U.S. has a very inconsistent picture on this issue,” Williams says. “I think there’s a misconception recently that these laws have been passed across the country. I think what people are seeing is a lot of legislation being proposed.”
Some states allow both civil and criminal proceedings against perpetrators, while others allow only one or the other. Some laws focus on minors, including a recent one in Mississippi. Over the past year or so, there has been a string of cases of middle and high school students using generative AI to create lewd images and videos of their classmates, especially female students. Other laws target adults, with lawmakers essentially updating existing bans on revenge porn.
According to Williams, while there is broad agreement that non-consensual deepfakes of minors are an “intrinsic moral wrong,” the law is “muddy” on what is “ethical” when it comes to non-consensual deepfakes of adults. In many cases, laws and proposed legislation require people to prove intent: that the deepfakes were created and shared in order to harm the subject.