
When “He Says ‘Hi Son’” Isn’t Just Memory: How AI’s Voice Cloning Is Redefining How We Grieve
Grief used to mean silence, photographs, memories stored in old letters. Now some people are hearing their lost loved ones speak again — through artificial intelligence. Creating voices from voice notes. Building avatars that chat. It’s comfort, sure. But also tricky territory.
What’s Going On
Diego Felix Dos Santos, 39, missed something simple after his father died. A voice. Then came a voice note from the hospital, and the rest followed: he uploaded that sample to ElevenLabs (a voice-AI service) to generate new messages in his dad's voice.
He now hears greetings like "Hi son, how are you?" in that familiar tone — new messages in a voice he thought was lost forever.
Companies such as StoryFile, HereAfter AI and Eternos are stepping in. They're offering "grief tech": services that let you create avatars, voice clones, digital twins of relatives who have passed. For some, that's healing. For others, it raises eyebrows.
The Sweet Spot & The Sharp Edges
People who’ve used these tools often say they’re not replacing mourning — but adding something gentle.
Anett Bommer, whose husband used Eternos before he died, calls it “part of my life now” — a project he built for his family. She didn’t lean on the avatar during the hardest grief, but it became something precious afterward.
But experts caution: this comfort isn’t without cost. What about consent (especially posthumous)?
What about emotional dependency — could someone get stuck in grief by holding onto these digital echoes? And then there’s the data privacy mess. Who owns the voice? Could it be misused later?
Researchers at Cambridge University have called for ongoing consent, transparency, and protections for sensitive data — because capabilities evolve fast, while laws and emotional readiness lag behind.
Why It Matters — More Than You Think
This isn't sci-fi. It's changing real lives. Here are some of the larger ripples:
- Mental health & grief work: Therapists are cautious. AI voice clones might help some people find closure, but for others, they risk delaying acceptance or complicating the natural course of grieving.
- Ethical precedents: If digital afterlives become more common, societies will need clear frameworks. How to define consent before death? What about rights over a person’s voice or likeness after they’re gone?
- Regulation & commercial versus personal use: Companies charging subscriptions, selling "legacy accounts" — that's fine if handled carefully. But commercialization might pressure companies to cut corners: weaker consent standards, data leaks, voices repurposed without proper oversight.
- Cultural, religious, personal variation: Not everyone accepts voice clones or avatars. For some, relics, rituals, faith carry the weight of remembrance. For others, this tech opens new paths to healing. There’s no one-size-fits-all.
What to Watch, What to Ask Yourself
Before you try something like this (if you ever consider it for your own loss), here are some questions & cautions:
| Question | Why It's Important |
| --- | --- |
| Did the deceased give consent before death (voice recordings, likeness)? | It affects the legality of, and the moral right to create, an AI version. |
| Can you control what's done with their digital data later? | Ensures the voice or likeness isn't misused commercially or manipulated. |
| Is there a plan for "turning off" or retiring the avatar if needed? | Guards against emotional dependency. |
| How might this affect your grieving process over time? | It might help some, but could stall emotional acceptance for others. |
My Take
I feel a pull toward what these tools offer: relief, closeness, something to hold onto. Grief is brutal, unpredictable. When someone gives you “just one more chance” to connect — even if virtually — there’s something sacred in that.
But I also worry. There’s a fine line between comfort and illusion. Between preserving memory and delaying farewell. Between tool and crutch.
As this grief tech grows, it needs guardrails: ethical design, honest marketing, clear user education. Because nothing should exploit grief for profit, or promise more than it can deliver.
This feels like the start of a deep conversation — about loss, legacy, what presence means when someone’s gone. AI voices aren’t ghosts. They’re echoes. Use them wisely.