When AI brings back the voices of the dead

By NST, September 16, 2025 – Reading time: 3 minutes
DIEGO Felix Dos Santos never expected to hear his late father’s voice again — until AI made it possible. “The tone of the voice is pretty perfect,” he says. “It feels like, almost, he’s here.”

After the 39-year-old’s father died unexpectedly last year, Dos Santos travelled to his native Brazil to be with family.

It was only after returning to his home in Edinburgh, Scotland, that he says he realised, “I had nothing to actually remind (me of) my dad.”

What he did have, though, was a voice note his father sent him from his hospital bed.

In July, Dos Santos took that voice note and, with the help of Eleven Labs — an artificial intelligence-powered voice generator platform founded in 2022 — paid a US$22 monthly fee to upload the audio and create new messages in his father’s voice, simulating conversations they never got to have.

“Hi son, how are you?” his father’s voice rings out from the app, just as it would on their usual weekly calls.

“Kisses. I love you, bossy,” the voice adds, using the nickname his father gave him when he was a boy.

Now, he and his wife, who was diagnosed with cancer in 2013, are considering creating AI voice clones of themselves, too.

Dos Santos’ experience reflects a growing trend where people are using AI not just to create digital likenesses, but to simulate the dead.

As these technologies become more personal and widespread, experts warn about the ethical and emotional risks — from questions of consent and data protection to the commercial incentives driving their development.

The market for AI technologies designed to help people process loss, known as “grief tech”, has grown exponentially in recent years.

Ignited by US startups such as StoryFile (an AI-powered video tool that lets people record themselves for posthumous playback) and HereAfter AI (a voice-based app that creates interactive avatars of deceased loved ones), this tech markets itself as a means to cope with, and perhaps even forestall, grief.

Robert LoCascio founded Eternos, a Palo Alto-based startup that helps people create an AI digital twin, last year after losing his father.

Since then, more than 400 people have used the platform to create interactive AI avatars, LoCascio said, with subscriptions starting from US$25 for a legacy account that allows a person’s story to remain accessible to loved ones after their death.

Michael Bommer, an engineer and former colleague of LoCascio’s, was among the first to use Eternos to create a digital replica of himself after learning of his terminal cancer diagnosis.

LoCascio said Bommer, who died last year, found closure in leaving a piece of himself behind for his family — and his family says it has brought them comfort too.

“It captures his essence well,” his wife Anett Bommer, who lives in Berlin, Germany, told Reuters in an email.

“I feel him close in my life through the AI because it was his last heartfelt project and this has now become part of my life.”

The goal of this technology isn’t to create digital ghosts, said Alex Quinn, the chief executive officer of Authentic Interactions Inc, the Los Angeles-based parent company of StoryFile.

Rather, it’s to preserve people’s memories while they’re still around to share them.

“These stories will cease to exist without some type of interference,” Quinn said, noting that while the limitations of AI clones were obvious — the avatar would not know the weather outside or who the current president is — the results were still worthwhile.

“I don’t think anyone ever wants to see someone’s history and someone’s story and someone’s memory completely go.”

One of the biggest concerns surrounding grief tech is consent: What does it mean to digitally recreate someone who ultimately has no control over how their likeness is used after they die?

While some firms such as Eleven Labs allow people to create digital likenesses of their loved ones posthumously, others are more restrictive.

LoCascio of Eternos, for example, said the company’s policy bars it from creating avatars of people unable to give their consent, and that it enforces this with checks, including requiring those setting up accounts to record their voice twice.

“We won’t cross the line,” he said. “I think, ethically, this doesn’t work.”

The writer is from Reuters

© New Straits Times Press (M) Bhd
