
An AI model will take your place when you die

…and it will misrepresent your personality to your friends and family. Enjoy!



People are using AI chatbots to chat with synthetic versions of their exes — and, yes, of the dead. They're training chatbots — let's call these 'GriefBots' — on the digital footprints of the deceased, and then conversing with them. How might this phenomenon alter the grieving process and our ability to move forward? Let's dig in.


Grieving and technology have always been entangled. We’ve long used technologies to maintain a connection with lost loved ones while moving forward in life. My mom passed unexpectedly in 2019 right before the pandemic, and on special occasions like her birthday or the anniversary of her passing, I’ll listen to an old voicemail or watch a video chat we had over Marco Polo, the asynchronous video chat app. I treasure them. My annual traditions and rituals have helped me sustain a connection to my mom while moving forward with my life.


But a GriefBot is quite different from these artifacts. It is not a static and authentic representation of a moment in time like an image or voicemail. It is instead a synthetic sliver of a person’s online presence, tweaked by developers and technologists, and maintained by the labor and profit margins of technology companies. It’s no longer something from the past to listen to or look at — it is interactive and dynamic. It extends your relationship with the dead into the present.


GriefBots take grieving — which, in many cases, is a social process that benefits from being witnessed — and enclose it as an individual experience. As grief expert David Kessler writes in his book Finding Meaning: The Sixth Stage of Grief, "We need a sense of community when we are in mourning because we were not meant to be islands of grief. The reality is that we heal as a tribe." This resonates with me — shortly after my mom passed, I was alone in a tiny apartment in Brooklyn amidst the pandemic. No one was there to witness my grief, and that complicated my experience. You could imagine interacting with a GriefBot having a similar effect, and in turn shaping the process of meaning-making: the story we tell ourselves about our relationship to the deceased, their past, and their passing.


GriefBots alter who and what contributes to meaning-making. Since grieving is typically social, friends and family contribute unknown anecdotes, offer different accounts of past events, and add nuance and layers of complexity to the life of the dead. Kessler says that the stories we tell ourselves repeatedly about the dead become our meaning. He explains how the story he told himself about his mom "kept me imprisoned in pain" but over time, "other points of view freed me." With a GriefBot, by contrast, the outputs are shaped by your questions and responses, by the limited data used to train the bot, and by the decisions of developers you've never met. Meaning is still made collaboratively, but the inputs are wildly different and scattered across many sources: friends, family, bits of data, and strangers working within specific (number go up!) incentives.


So interacting with a GriefBot alters who (and what!) contributes to meaning-making — but it will also likely alter the contents of the stories we tell. When I look at old photos or listen to an audio message, I'm flooded with memories of my mom. Some are crystal clear, others are more abstract, but they're rooted in my experience of those moments. I can't see those moments through her eyes. But GriefBots are trained on personal data — e.g. social media posts, digitized diary entries, and (likely, one day) cell phone data, among other information. As a result, when a GriefBot produces an unexpected output, we won't know if it's true. In my conversation with Tamara Kneese,* author of the great book Death Glitch: How Techno-Solutionism Fails Us in This Life and Beyond, she imagined a scenario in which a GriefBot of your grandfather tells you about his extramarital affairs. Or maybe the bot shares a story that contradicts your memory of an event? In either case, the story isn't coming from our own experience and memories. Unexpected outputs risk destabilizing and contesting our past, and interacting with a GriefBot risks keeping us stuck there.

☝️ I just want to point out: I am very aware that we all experience grief and mourn loss in vastly different ways, and that a GriefBot or something similar might be very helpful for some and repugnant to others. What I want to demonstrate here is how AI systems have the potential to transform mourning into a process mediated by probabilistic outputs about the past and present — outputs that we cannot be sure are an accurate representation (whatever that means!).

GriefBots might even amplify the effects of ambiguous loss and prevent healing. Now, I don't believe in closure, or in the idea that grieving ends, that there's a finality to it. While my experience of grief has changed shape over the years, I've been able to move forward. Part of that stems from imagining the future and making choices to build it intentionally. As Kessler puts it:

“Every moment we are making choices — whether to move toward healing or to stay stuck in pain. Like all the other stages, the sixth stage of grief requires movement. We can’t move into the future without leaving the past. We have to say goodbye to the life we had and say ‘yes’ to the future.”

I've made choices that gave me momentum and nudged me toward the future — for example, I moved to Los Angeles from that tiny Brooklyn apartment and started Untangled.** While I've healed along the way, 'closure' doesn't feel like the right word. There wasn't, and isn't, a complete ending to my relationship with my mom. As author Mitch Albom wrote in Tuesdays with Morrie, "Death ends a life, not a relationship." But we can experience a kind of loss that confuses and blocks resolution. Dr. Pauline Boss coined the term 'ambiguous loss,' which her book The Myth of Closure defines as "loss that remains unclear and without verification or immediate resolution, which may never be achieved." There are two types: one in which there is physical presence but psychological absence — a parent with Alzheimer's disease, for example. The second is when there is physical absence but psychological presence — not unlike interactions with a GriefBot.



The ambiguity results from the sense that our loved one is there but not there. Even if we know, with 100% certainty, that the chatbot is a synthetic sliver of our lost loved one, we can live in a familiar, comfortable world — one animated by the presence, the aliveness, of our loved one, even though they're no longer here. As Boss wrote, "Deciding to move forward can be frightening because it makes you feel like you're losing your loved one not once, but twice. It's also scary because it requires you to move into the unknown, into a life that is different without that person." With a GriefBot, we don't have to decide. We don't have to move forward, because we can pull the past into the present.


But sometimes, an AI-generated artifact might actually unstick this ambiguity. I spoke with the writer Caitlin Dewey, who shared that after her third miscarriage, she used the app Remini to create hypothetical photos of what her child might have looked like, based on pictures of her and her husband. In our conversation, Dewey explained that creating the images helped to validate her experience of grief.


The question, then, becomes: does the tool make your experience of your lost loved one more real or more ambiguous? If I created a mom-inspired GriefBot and it felt uncannily similar to her, I think I'd feel stuck; the 'here but not here' ambiguity would complicate my grief. And when it glitched, the gap between the output and my memories and expectations would feel jarring. But we all grieve differently — in an episode of the podcast Reply All, for example, a teenager tells the story of using The Sims to grieve her grandmother and say a final goodbye, a chance she didn't originally have. And for Dewey, creating the images made an otherwise ambiguous loss more concrete, allowing her to grieve and move forward. In either case, though, the decision to use or not use one of these tools is roughly where one's power ends.


In her book, Kneese makes the case that tech platforms now hold tremendous power over how we grieve and mourn, and they aren't designed for it. While we often overlook the labor required to maintain these systems, she explains, technologists, content moderators, and developers regularly make decisions that mediate our experience of mourning and grieving. What happens when the model starts to glitch and says something the deceased never would, or shares something vulnerable in a cadence or tone the deceased would never use? What happens if the company decides to use the data of your lost loved one for a purpose you don't like? What if it decides to turn a profit and charge you a subscription fee? What if it goes out of business? Or what if it just deletes the account of the deceased?


Earlier this year, Marco Polo emailed all of its users to say we would soon lose access to inactive conversations. My heart sank when I read it — I was probably going to lose access to conversations with my mom just because a company wanted to free up storage space. But as I read the fine print, I realized that Marco Polo offered a process for people in my situation. I submitted a copy of my mom's death certificate, and the videos remain safe in my archive. For now, at least. Marco Polo is a venture-backed company that needs to minimize costs and generate returns. I don't own those videos — they do. I'm one policy change away from losing a point of connection to my mom. But no company can erase my memories, and no technology can capture the sound of her laugh.


*Tamara is also my colleague, but, as a friendly reminder, Untangled is a personal project and does not reflect the institutional positions of the Data & Society Research Institute.


**When my sister and I cleaned out my mom’s houseboat, I found some of my old writing for the Huffington Post, printed out and tucked away in a cabinet. My mom didn’t always understand what I do for a living. Sometimes I think Untangled is my attempt to explain to her what I do.
