Families urged to use safe words when online amid dangerous rise of AI deepfakes

As AI advances, one worrying trend is the surge of fake videos, or 'deepfakes', emerging online.
It is becoming increasingly hard to tell AI-generated images and videos apart from real ones, which raises serious concerns for people's safety.
AI-generated videos of celebrities and notable figures are cropping up all over TikTok, and they look scarily realistic. From Jake Paul to Tupac and Robin Williams, the clips are convincing enough to spark real-world fears. Now people are recommending that families and friends agree on 'safe words' or passwords personal to them, to help protect against impersonation scams.
The threat of deepfakes is deeply worrying: targeted scammers are cloning people's faces and voices and creating fake video calls to demand money, with devastating consequences for victims.
The line between what's real and what's fake is increasingly blurred, and one TikToker, who goes by the name @chelseaexplainsitall, shared just how easy it is to create these fake videos. She posted a TikTok of herself using OpenAI's video app, Sora, a text-to-video platform that produces the realistic clips.
Sora has passed one million downloads and is currently sitting at the top of the iOS App Store. Users can generate a short video in less than a minute, showing people in situations they were never in.
After trying out the app, Chelsea urged people: "You and your family need a safe word." While these deepfake videos can mimic your voice, face and mannerisms, they won't be able to copy secret words.
"These deepfakes are crazy, the scams are going to be insane," Chelsea said. "It's so realistic, it's actually frightening." People were quick to jump to the comments, including one person who said: "Guys.. When I tell you. As someone working in banking. Protect yourself and your loved ones."
Another concern is the use of deceased celebrities in these deepfake videos. Experts worry about the potential for historical misinformation, and about deceased figures who cannot consent to, or opt out of, the AI models.
A spokesperson for OpenAI told NBC News: “While there are strong free speech interests in depicting historical figures, we believe that public figures and their families should ultimately have control over how their likeness is used. For public figures who are recently deceased, authorized representatives or owners of their estate can request that their likeness not be used in Sora cameos.”
Daily Mirror