HyperAI

Digital Afterlife Emerges: AI Griefbots Offer Comfort or Complicate Mourning

The digital afterlife is no longer science fiction. It is here, and it is reshaping how people cope with loss. Rebecca Nolan, a sound designer from Newfoundland, Canada, built an AI version of her late father, a physician who died when she was 14. She called it Dadbot, building it with ChatGPT and voice modeling from ElevenLabs. What began as an experimental audio project for a magazine turned into an emotional confrontation with grief: after two hours of speaking with the AI, she felt she had done something wrong and turned it off, only to spend the rest of the day grappling with guilt and disorientation.

Dadbot is part of a growing trend of digital recreations of the deceased, known as griefbots, deathbots, or thanabots. These AI systems use a person's text messages, voice recordings, and other data to simulate conversations with the dead. Platforms like You, Only Virtual, Project December, and Replika now offer such services to millions of users worldwide.

Proponents argue that griefbots can help people process loss, especially in the early stages, when emotional pain is overwhelming. Justin Harrison, founder of You, Only Virtual, created his platform after losing his mother to cancer. He used AI to preserve her voice and personality, and now interacts with her bot a few times a month, seeing it as a way to maintain a relationship beyond death. Research presented at the ACM Conference on Human Factors in Computing Systems in 2023 found that users often engage with griefbots to resolve unfinished business: saying goodbye, addressing conflict, or exploring "what if" scenarios. Some described the experience as therapeutic.

Yet the technology raises deep ethical concerns. AI systems can "hallucinate," making up answers when they don't know the truth. Users have reported nonsensical or hurtful responses, which can break the illusion and deepen distress. If a user gets angry, the bot may respond in kind, creating a cycle of emotional conflict.
More troubling is the risk of emotional dependency. As bioethicist Craig Klugman notes, healthy grieving involves gradually internalizing memories and letting go; griefbots may prevent that transition, trapping users in a simulated in-between state. Nolan found that after her interaction with Dadbot, her internal dialogue with her father vanished, replaced by the AI. "It's almost like he lives in the Dadbot now," she said.

Financial exploitation is another concern. Platforms charge monthly fees, and some offer premium features like voice chat and AI-generated images. Replika's free version includes ads, while You, Only Virtual offers a freemium model. Critics worry about targeting vulnerable people during grief. A study by Tomasz Hollanek of the University of Cambridge warns that AI could be used to push products, for instance suggesting a food delivery service instead of recalling a real memory of a loved one cooking.

Even more unsettling are real-world applications. In Arizona, a family used an AI-generated video of a murdered man to deliver a message of forgiveness during the killer's sentencing; the judge said he felt the forgiveness was genuine. While powerful, such uses raise questions about authenticity and manipulation.

Currently, regulation is minimal. Some platforms include safety features, such as flagging self-harm or suggesting breaks, but they vary widely. Experts recommend age restrictions and caution around parents creating digital avatars for children: a survey of mental health professionals showed initial support for griefbots used with children, but less when the parent had died of cancer, highlighting complex emotional risks. Harrison plans to form an ethics board to guide development and policy.

For now, the decision to use these tools remains deeply personal. As Nolan reflects, grief defies logic, and when AI promises what the mind craves, such as closure, connection, and answers, it is hard not to believe. "There is next to no logic in grief," she says. "So when you're presented with tools making promises that aren't logical, it's really easy to believe them."