In the United States, a mother received a disturbing call in which it sounded as though her 15-year-old daughter had been kidnapped while on a skiing vacation. In reality, it was an AI imitating her daughter's voice.

The kidnapping was only faked!

Criminals in the USA tried to scam a mother over the phone using a particularly devious and nasty method. Jennifer DeStefano of Arizona reports that scammers imitated her daughter's voice to fake a kidnapping and demand a ransom.

DeStefano told station WKYT she received a call from an unknown number. Because her 15-year-old daughter Brie was on a skiing vacation, she answered the call out of concern about a possible accident. Then she heard her daughter's voice saying "Mom" and sobbing.

“I said, 'What happened?' And she said, 'Mom, I messed up,' and she's sobbing and crying.”

The mother later emphasized that she had no doubt it was her daughter's voice. “It was her voice. It was the way she speaks. She would have cried just like that,” she said. Then a man's voice announced that this was a kidnapping and that he had her daughter in his power. He would release her in exchange for a payment of one million dollars. When DeStefano said she didn't have that much money, the demand was lowered to $50,000.

DeStefano was at her other daughter's dance studio at the time; one of the mothers there immediately called the police while another contacted DeStefano's husband. Within a very short time it was clear that her daughter was safe and had not been kidnapped.

The fact is: the girl's voice was generated by artificial intelligence.

WKYT quotes an AI expert and computer science professor at Arizona State University as saying that voice-generation technology is advancing rapidly: “You can't believe your ears anymore.” In the past, imitation required a large number of voice samples; today a three-second excerpt is enough. “derStandard” reports the same: “New Microsoft AI can imitate voices after just three seconds”.

According to the FBI, criminals often use voice samples from social networks. To protect yourself, profiles should be set to private so they are not publicly visible. In DeStefano's case, however, her daughter did not have a public social media account.


How does voice imitation with AI work?

Voice imitation with AI, also known as voice cloning, refers to the use of artificial intelligence and speech synthesis to imitate or reproduce a human voice. Here is an overview of how the process works:

  1. Data collection: First, a large number of voice recordings of the person whose voice is to be imitated are collected. These recordings should cover as many different speaking situations and tones as possible in order to obtain a comprehensive representation of the voice.
  2. Pre-processing: The voice recordings are pre-processed to remove background noise and other interference. The data is then divided into smaller segments to make analysis easier.
  3. Feature extraction: Features such as pitch, loudness, timbre and speech rate are extracted from the pre-processed speech data. These features serve as the basis for training the AI model (a minimal sketch of this step follows after this list).
  4. AI model training: A neural network, such as a Long Short-Term Memory (LSTM) or a Transformer model, is trained on the extracted features. The model learns to recognize patterns in the data and to generalize them so that it can reproduce the voice as closely as possible.
  5. Text-to-speech synthesis (TTS): Once the AI model is trained, it can be used for voice imitation. Text is fed into the model, which then produces audio output that sounds like the imitated voice.
  6. Fine-tuning: The generated voice imitation can be further optimized by iteratively adjusting and improving the model, either through additional training or by tuning the model's hyperparameters.
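
To make steps 2 and 3 a little more concrete, here is a minimal sketch in Python using the open-source audio library librosa. The file name `sample.wav`, the sample rate and all frame and frequency parameters are illustrative assumptions, not part of any specific cloning system; real voice-cloning pipelines work with far larger datasets and models.

```python
# Minimal sketch of steps 2-3: pre-processing and feature extraction.
# Assumptions: `librosa` is installed and a recording `sample.wav` exists;
# all parameter values below are illustrative defaults.
import librosa
import numpy as np

# Step 2 (pre-processing): load the recording and trim silence at the edges.
y, sr = librosa.load("sample.wav", sr=16000)  # resample to 16 kHz mono
y, _ = librosa.effects.trim(y, top_db=30)     # drop quiet leading/trailing audio

frame_length, hop_length = 2048, 512          # short overlapping analysis frames

# Step 3 (feature extraction): pitch, loudness and timbre per frame.
f0, voiced, _ = librosa.pyin(                 # pitch (fundamental frequency)
    y, fmin=librosa.note_to_hz("C2"), fmax=librosa.note_to_hz("C7"),
    sr=sr, frame_length=frame_length, hop_length=hop_length)
rms = librosa.feature.rms(                    # loudness (RMS energy)
    y=y, frame_length=frame_length, hop_length=hop_length)
mfcc = librosa.feature.mfcc(                  # timbre (MFCC coefficients)
    y=y, sr=sr, n_mfcc=13, hop_length=hop_length)

print(f"{np.sum(voiced)} voiced frames of {len(f0)}")
print("loudness:", rms.shape, "| timbre:", mfcc.shape)
```

In modern systems, hand-crafted features like these are increasingly replaced by learned speaker embeddings, but the basic flow is the same: cleaned audio goes in, a compact representation of the voice comes out, and that representation feeds the training and synthesis described in steps 4 to 6.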

Investigators also recommend remaining calm when receiving such calls and being wary of calls from unknown or international numbers.
It makes sense to ask about personal details that kidnappers could hardly know. Several such extortion calls have already been reported in the USA. DeStefano endured only a few minutes of fear before it became clear that her daughter was safe, and she paid no ransom. Still, she broke down emotionally afterward, saying, “There were tears for all the what-ifs. Everything seemed so real.”

DeStefano also warned on Facebook, saying: “The only way to stop this is with public awareness.”

This incident shows how advanced AI technologies have become and how important it is to protect yourself from such scams. In such cases, it is advisable to contact the police and not to make any payments until the situation has been clarified. Experts recommend being particularly vigilant with calls from unknown or international numbers and asking questions that the scammers cannot answer. In addition, personal information on social media should be kept to a minimum and profiles set to private to reduce the risk of such fraud.

Source:
WKYT: 'I've got your daughter': Mom warns of terrifying AI voice cloning scam that faked kidnapping


Also read:
Misuse of ChatGPT: AI model used to create malware
ChatGPT – How fraudsters and criminals use artificial intelligence for fraud, misinformation and cybercrime

Notes:
1) This content reflects the current state of affairs at the time of publication. The reproduction of individual images, screenshots, embeds or video sequences serves to discuss the topic.
2) Individual contributions were created through the use of machine assistance and were carefully checked by the Mimikama editorial team before publication. (Reason)