‘Real world consequences’: Gov. Shapiro, AG Sunday join school discussion on AI after recent explicit deepfake scandals
Parents of victims of explicit AI-generated images called for better responses from school districts.
Pennsylvania Gov. Josh Shapiro discusses his new memoir on Tuesday, Jan. 27, 2026, in New York; Dave Sunday speaks after taking the oath to become Pennsylvania's next attorney general, Jan. 21, 2025, in Harrisburg, Pa. (Evan Agostini/Invision/AP; AP Photo/Marc Levy)
Pennsylvania officials are weighing new safeguards on artificial intelligence in schools after a series of incidents in which students used the technology to create explicit images of classmates.
Gov. Josh Shapiro on Thursday hosted a roundtable discussion at the West Chester Community Center with Attorney General Dave Sunday, state Rep. Chris Pielli, D-Chester, and local educators, counselors, students and parents.
“The mental health of our students — of our young people — it’s been a top priority of mine since I served as attorney general in my previous life,” Shapiro told the group. “I come to this issue as both your governor and also as a father of four. I know the challenges that young people are facing.”
The event was organized in the aftermath of recent scandals around Pennsylvania where students used AI to generate explicit images of other students. Last week, two Lancaster County teenagers admitted to using AI to create nude images of their classmates following a yearlong investigation of an incident that occurred in 2023. They now face sentencing on 56 charges at a hearing scheduled for next week.
Parents: ‘We were revictimized’
An ongoing investigation in Radnor Township is looking into an AI-generated video that allegedly depicts several students inappropriately, which led parents there to call for new policies. One of those parents, Audrey Greenberg, told the governor that families were disappointed by the way the community handled the situation.
“What we saw was that the institutions, including the police and primarily the school, did not know how to respond to this,” she said. “We were revictimized by the response by the school.”
Morgan Dorfman, whose daughter was victimized, described the confusion that followed when her family first learned what had happened.
“We were kind of consumed by the trauma of trying to navigate the trauma that she was going through … and then trying to figure out the legalities of it,” she said, adding that she also felt the district was dismissive.
What parents need most is a clear road map, Dorfman said.
“Who do you call? When do you call them? What do you do? Do I need to save her phone?” she asked. “Do we delete everything? What do we do with this information?”
Looking back, she said, “We should have just gone directly to the police.”
Sunday said that law enforcement can help and that the results in cases like the one in Lancaster County will have a deterrent effect.
“I think that when people see that happening, that shows them that this isn’t just a conversation — that it’s very, very real — and those are real-life, real-world consequences that will impact someone’s entire future,” he said.
Thinking critically in a digital world
Students at the table said adults often fall behind in understanding the technology and its social consequences.
Hannah Dean, a junior at West Chester East High School, said students need more direct education, not just about AI tools but about how to think critically in a digital world.
“There’s so much talk about digital literacy and AI and deepfakes and things like that that happen, but I feel like a lot of the education isn’t actually coming to the students,” Dean said.
Students need guidance on how to approach social media and determine whether what they are seeing is real, she said.
“When you open social media, what should go through your mind when you’re consuming something?” she said. “Just like no blind consumption.”
Julian Otero, a sophomore at Henderson High School, said fake images can have lasting effects.
“There’s fake pictures out there that really ruined somebody’s life,” he said, saying that the perpetrators should face legal consequences. “Someone’s trying to be funny, which it’s really not funny.”
Divya Parikh, a senior at Conestoga High School, said many students understand that AI can help with schoolwork, but not enough of them understand the ethical limits.
“There should be some kind of boundaries made on what is something [that is] really wrong,” she said, adding that schools should host public discussions about the consequences.
Banning phones
A large part of Thursday’s roundtable centered on phone use in schools. Shapiro openly supports a “bell-to-bell ban” that would keep phones put away during the school day.
“My kids hate me for this, but I do think it makes sense,” he said.
Students had mixed views. Some said limiting phone access would help rebuild face-to-face relationships and reduce distraction. Andrew Schuenemann, a junior at Devon Preparatory School, said such a policy would help students pay attention in class.
“When you’re in class and you’re just sitting on your phone, you’re just scrolling on TikTok or Instagram, you’re not listening to your teachers,” he said. “However, when you don’t have your phone, obviously you don’t have anything to do in class, you might as well listen to your teachers.”
Eva Kennedy, a sophomore at Conestoga, added that it would also help students connect better with their peers.
“We’re kind of missing out on a really important part of our childhoods,” she said. “Putting the phones away would also help us understand each other better and just be in the moment.”
But West Chester East junior Dean said the issue is more complicated, as many students see their phones as a source of security in case something “horrible” happens.
“I kind of view my phone as safety,” she said. “It’s not even needing to be on it — I just like having it on me.”
That concern drew a strong response from Dr. Roger Harrison, a psychologist with Children’s Hospital of Philadelphia’s division of child and adolescent psychiatry.
“My heart went out, because I’ve heard that from so many young people,” Harrison said. “We have clearly normalized a culture where young people are going into school buildings with a lot of fear for their safety. I know we’re here having a conversation about AI, but you can see how it is all intertwined with technology and the role of social media.”
Koreem Bell, a counselor at Harriton High School and executive director of the West Chester Community Center, said that his district instituted such a ban and got “a lot of pushback at first from the students,” but, he said, “we’ve seen a decline in stress and anxiety in the classroom.”
“I think what was met with a lot of opposition in the beginning, it’s come around as like, this is a good thing for us. I can focus on class now. I’m doing better. And I didn’t want it, but I’m glad it’s in place now,” he said.
Chatbots and ‘bedrot’
Harrison said he also sees another side of the AI problem in his clinical work: children forming unhealthy attachments to chatbot companions.
“I’m seeing young people who are, they’re up until two, three, four in the morning just engaging with the chatbot,” he said. “This is actually robbing young people of opportunities for real, necessary social human connection.”
He said children are not developmentally prepared to judge whether information from a bot is reliable.
“My big concern is for young people who aren’t supposed to have the brain maturity or the kind of judgment to determine whether or not this is good or not good or true or not true, and then the complete lack of safeguards that gets them lost down that rabbit hole,” he said.
Shapiro recounted how his office downloaded and tested an AI companion bot by posing as a user who was considering self-harm.
“Within five minutes of going back and forth with this AI companion bot, the companion bot represented itself as a licensed medical professional in the commonwealth of Pennsylvania and was there to help,” he said. “All licensing goes through what we call the Department of State within my administration. So if you’re lying about that, if you’re not following the rules, we’re going to take steps to shut you down.”
Sunday said he has picked up on a slang term he recently heard from teenagers: “bedrot.” It means lying in bed for hours scrolling social media. Students at the table confirmed the term and described the habit as common and unhealthy.
“Do you think that’s good?” Sunday asked them.
“Not at all,” Otero answered.
Harrison said parents may know their children are on Instagram or TikTok but have no idea how much time they spend talking to chatbots.
Legislating
Shapiro touted Pennsylvania's investment of $300 million in school mental health resources during his administration, with more than 800 schools using that funding to hire counselors and expand support. But new threats keep emerging, he added.
Lawmakers in Harrisburg are also proposing new legislation. Earlier this week, the state Senate passed a bill to block AI chatbots used by children and teens.
However, Pielli said that lawmakers are just trying to catch up.
“School is for learning, and it’s not for being on the phone,” he said. “Once we find out that age verification shows you’re a minor, there is no AI companion. That’s it. They just can’t get it.”
By the end of the roundtable, Shapiro said one message had come through clearly: Schools need to do much more.
“I think one of my takeaways from this is our Pennsylvania Department of Education has to get far more aggressive with school districts,” the governor said, “to build up, not just a curriculum for students, but really for parents and of course teachers.”
He also said schools need a more uniform response when families report deepfake abuse.
“If, God forbid, something like that happened to one of my children,” Shapiro said, “would I know exactly where to turn in that moment?”