Researchers have been sounding the alarm about the explosion of AI-generated child sexual abuse material depicting real victims or virtual characters. Last year, the FBI warned that it was continuing to receive reports from victims, both minors and adults, whose photos or videos had been used to create explicit content that was shared online.
Several states have passed their own laws to combat the problem, such as criminalizing nonconsensual deepfake pornography or giving victims the right to sue perpetrators for damages in civil court.