They clone your voice with AI to fake a kidnapping: this is the scam

With just 3 seconds of your voice, AI can create a fake kidnapping call. Learn how this dangerous scam works and how to protect your family.

You get a call: it's a panicked relative demanding money. The voice is identical, but it's an AI scam. The FBI has issued an alert about this new form of extortion that uses your own voice against you. We'll explain how to identify it and protect yourself.

Voice cloning technology has advanced at a dizzying pace. AI tools, some of them publicly available, can analyze a very short audio sample to replicate a person's unique pitch, cadence, and inflections.

Scammers use this technology to fabricate voice messages or even hold real-time conversations, creating a believable emergency scenario. According to security reports, the goal is to create a state of panic in the victim so they act impulsively and transfer money without verifying the situation.

"If you receive a message claiming to be from a senior U.S. official, don't assume it's authentic. Malicious actors are increasingly exploiting AI-generated audio to impersonate public figures or personal connections and boost the credibility of their schemes." – FBI alert.

Although the technology is advanced, it's not perfect. The FBI and cybersecurity experts recommend paying attention to certain signs that can reveal deception.

  • Flat or awkward emotional tone: Generative AI still struggles to replicate complex emotions naturally. The voice may sound monotonous or have odd inflections despite the distressing content of the message.
  • Extreme urgency and pressure: Scammers insist that there's no time to think and that you must act immediately. They'll tell you not to hang up or try to contact anyone else.
  • Unusual payment requests: They typically ask for immediate bank transfers, cryptocurrency, or the purchase of gift cards, methods that are difficult to trace and reverse.
  • Poor call quality: Scammers often pretend there is poor signal or interference to justify any voice anomalies or to avoid answering complex questions.

Prevention and calm are the best defenses. Share these steps with your family members, especially the most vulnerable, such as older adults.

  • Establish a family "safe word": Agree on a secret word or phrase that only you and your loved ones know. If you receive an emergency call, ask the other party to use the safe word. A scammer won't know it.
  • Ask a personal question only they would know: Pose a question whose answer can't be found on social media, such as "What was our first pet's name?" or "What restaurant did we celebrate your last birthday at?" An inability to answer gives the fraud away.
  • Hang up and verify by another means: Even if the scammer insists that you stay on the line, hang up. Immediately afterward, call the supposedly endangered person on their usual phone number, or contact another family member, to verify the situation. Don't use the number the call came from.
  • Be wary of urgency: Never make financial decisions under pressure. Take a moment to breathe and think logically. No real emergency will be resolved solely by an immediate money transfer to a stranger.

If you believe you've been a victim of this type of scam, report it immediately to local authorities and your country's cybersecurity police. Sharing this information is key to preventing more people from falling into this trap.

Giovanna Cancino
La Verdad Yucatán
