At the annual re:MARS conference in Las Vegas, Amazon's Senior Vice President and Head Scientist for Alexa, Rohit Prasad, announced a number of new and upcoming features for the Amazon Echo, better known as Alexa. One potential new feature is causing debate: the voice assistant is said to be able to imitate any voice from short audio clips.
The dead grandmother reads stories
In the scenario presented at the event, the voice of a deceased loved one (in this case, a grandmother) is used to read “The Wizard of Oz” to a grandchild. Prasad noted that with the new technology, the company can achieve very convincing audio output from less than a minute of recorded speech.
“This required developments where we had to learn to produce a high-quality voice from less than a minute of recording, as opposed to hours of recording in a studio. We achieved it by framing the problem as a voice conversion task rather than a speech generation task. We are undoubtedly living in the golden age of AI, where our dreams and visions of the future are becoming reality.”
Rohit Prasad
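To make the “voice conversion rather than speech generation” framing a little more concrete, here is a minimal, purely illustrative Python sketch. Every function in it is a hypothetical placeholder and has nothing to do with Amazon's actual models; it only shows the idea of condensing a short reference clip into a reusable voice vector that re-voices speech from an existing synthesizer.

```python
# Conceptual sketch of the "voice conversion" framing described in the quote:
# instead of training a full text-to-speech model on hours of a target speaker,
# a short reference clip is distilled into a speaker embedding that is used to
# re-voice speech produced by an existing model.
# All components below are hypothetical numpy placeholders, not Amazon's system.
import numpy as np


def speaker_embedding(reference_clip: np.ndarray, dim: int = 256) -> np.ndarray:
    """Placeholder: map a short reference waveform to a fixed-size voice vector."""
    rng = np.random.default_rng(abs(hash(reference_clip.tobytes())) % (2**32))
    return rng.standard_normal(dim)


def base_tts(text: str, sample_rate: int = 16000) -> np.ndarray:
    """Placeholder: a generic TTS voice speaking the text (here: silence of plausible length)."""
    duration_s = max(1, len(text)) * 0.06  # rough 60 ms per character
    return np.zeros(int(duration_s * sample_rate))


def convert_voice(source_audio: np.ndarray, voice_vector: np.ndarray) -> np.ndarray:
    """Placeholder: re-synthesize the source audio in the target voice."""
    # A real converter would separate linguistic content from speaker identity
    # and recombine the content with `voice_vector`; here we just return a copy.
    return source_audio.copy()


# Less than a minute of the target speaker is enough to build the voice vector...
reference_clip = np.zeros(16000 * 55)  # 55 seconds of (placeholder) audio
voice = speaker_embedding(reference_clip)

# ...which can then be applied to arbitrary synthesized speech.
generic_audio = base_tts("Once upon a time, in the land of Oz...")
personalized_audio = convert_voice(generic_audio, voice)
print(personalized_audio.shape, voice.shape)
```

The point of this framing is that the expensive, data-hungry part (the base synthesizer) only has to be trained once, while each additional voice requires nothing more than a short clip from which the conditioning vector is derived.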
Prasad explained that while Alexa could mimic any voice, the feature is intended above all for memorializing a deceased family member, especially given that “so many of us have lost someone we love during the Covid-19 pandemic.”
“While AI can’t erase the pain of loss, it can definitely make the memories last,” he added.
Amazon did not provide further details or a timeline for when the new functions will be rolled out to Alexa.
Really a good idea?
One of Alexa's other new functions is that it will become more interactive, for example by asking how your day was. And that raises the question: do we really want that?
Do we want the voice of a deceased grandmother to suddenly pipe up out of nowhere and ask how our day was? Preferably while we're watching a horror film... because that isn't scary enough yet! And is a computer-generated Alexa voice really a fitting keepsake of a deceased person?
Not to mention that such audio deepfakes give fraudsters completely new options: users could suddenly receive a “real” voice message from a celebrity whose sentences were simply generated beforehand with Alexa. Or targeted scams in which a daughter's voice is secretly recorded and then cloned to ask her parents for money.
So technically this is certainly a very interesting development, but building it into Alexa could also open the door to all sorts of fraud. And I don't imagine the practical application being particularly tactful either:
“Grandma, could you please talk for a minute? I want to record your voice and then when you die, my Alexa will have your voice. Great, huh?”
Brave new world.
Sources: TechCrunch, Reuters, CNBC
Notes:
1) This content reflects the state of affairs at the time of publication. The reproduction of individual images, screenshots, embeds or video sequences serves to discuss the topic.
2) Individual contributions were created with the use of machine assistance and were carefully checked by the Mimikama editorial team before publication.

