A new artificial intelligence model is making waves for its ability to create realistic videos.
OpenAI, the company that makes ChatGPT, released Sora 2 last week — and with it came a social media app populated entirely by AI-generated videos.
A story on NPR’s All Things Considered said the early results of the new model “are both wowing and worrying researchers.”
The Conversation spoke with ʻIolani School emerging technologies teacher Gabriel Yanagihara about the new video generation model and its potential for realistic deepfakes.

“You can type in anything you want, feed it any sort of raw information, and it’ll be able to create video for you. And this is unique in that it’s creating video that never existed before, it’s completely synthetic,” he said.
Yanagihara has experimented with Sora 2’s deepfake feature, which enables users to upload a few seconds of someone's face and voice, and then direct the app to generate a version of that person performing a requested action.
The potential misuse of deepfakes can lead to sophisticated cybersecurity and social engineering issues, he said.
“A lot of the work that we're going to have to do addressing these new technologies and what they can do is just literacy,” he told HPR.
“Being able to look at content and do all the critical thinking and fact checking to verify if this is, in fact, something that's happened through vetted sources or through good research on your own, is going to be like a life skill that everyone's going to need to have, or we're going to enter this era where we're kind of past knowing if anything's ever real, ever again.”
One rule Yanagihara recommends is to fact-check any video that elicits an emotional reaction.
“One of the biggest strengths of AI literacy is, once you understand how the technology works, even at a very basic level, it kind of inoculates you a little bit,” he said. “It kind of protects you a little bit from the visceral reaction, because now you have this layer of kind of suspicion on top of everything you see.”
This story aired on The Conversation on Oct. 8, 2025. The Conversation airs weekdays at 11 a.m. Hannah Kaʻiulani Coburn adapted this story for the web.