Letting go: grieving in the age of AI

In the aftermath of the Scottish Bereavement Charter Conference on the Future of Bereavement in Scotland – held a couple of days ago, which gathered a very thoughtful range of participants and workshops, including the inspiring contributions of Debbie Kerslake of Birmingham – the growing intersection of AI and grieving deserves, I think, some wider consideration and reflection.

In my previous writing on “synthetic resurrection,” I invited readers to consider how emerging digital technologies reshape the landscapes of memory and grief. At its heart, that piece asked: how might we hold the dead? And how might technology both support and challenge the dignity of that process?

Now, the News Agents piece “Would you use AI to talk to your dead relatives?” – and an intriguing phone-in conversation on LBC just last week – bring that philosophical question into the everyday. The piece introduces Reflekta, an AI-driven service that enables families to create digital avatars of loved ones – fed by their memories, voices, and texts – and engage in ongoing conversation for a subscription fee. Some find comfort and continuity in this, especially those wishing to preserve stories (as Reflekta’s founder Greg Matusky does, using the voice of his WWII veteran father).

Yet others, like reporter Lewis Goodall, warn of potential exploitation – cautioning that grief is not a product to be monetised, and that synthetic voices may hinder the natural process of letting go. I tend to agree with Goodall. While Reflekta’s founder hopes to preserve family stories, critics are rightly concerned about the ethical and emotional risks: simulating a person risks substituting reality with “fiction” – a phantom that may hinder natural grieving and closure.

At this week’s bereavement conference, Debbie Kerslake joined me to explore AI and bereavement, placing technological innovation within a broader human rights and community context. She is also the Director of Brum YODO, whose work – organising creative, communal events like the A Matter of Life & Death Festival – underscores the power of shared ritual and story to process loss, foster connection, and honour forms of grief that resist digital reanimation.

Brum YODO’s ethos reminds us that talking about death, creating spaces for remembrance, and imagining communal memorials are the foundational practices for healthy bereavement. They resist loneliness, stigma, and the commodification of grief. Yet these accepted approaches, including our own excellent Good Life, Good Death, Good Grief programme in Scotland, prompt a basic yet fundamental question: can AI support healthy grieving?

The short answer is probably that AI may provide narrative preservation, but it cannot replace relational, embodied grieving.

The emerging scholarship on “deathbots” – AI-driven chatbots that emulate the presence of the deceased, sometimes called Interactive Personality Constructs of the Dead (IPCDs) – highlights a troubling concern: they may replicate some sense of presence, but they miss the profound intersubjectivity of real relationships, risking unresolved grief or emotional confusion.

For me, at a very practical level, a fundamental limitation lies at the heart of such griefbots: every response is based on static, past data – the recorded voice, texts, and anecdotes of the person who has died. These AI reproductions cannot account for the reality of change: a loved one’s voice, attitudes, or advice would inevitably have evolved had they lived – through new world events, personal growth, or shifting relationships. They ignore the truth of personal development. Growth and wisdom occur continuously, and we expect as much from those who remain with us. AI chatbots cannot simulate these future selves of the deceased – they are frozen in time.

Thus, what such technology offers is not an evolving, relational presence, but a nostalgic snapshot – a memory preserved, not a relationship nurtured.

While AI might help preserve memories – even offer solace in loneliness – its central limitation is a lack of authenticity: a digital proxy cannot truly feel, respond, or evolve as the real person did. Such systems risk dangerous dependency, and overreliance on an AI persona may inhibit moving forward in a healthy way. They also raise profound ethical concerns.

There are also other psychological risks associated with griefbots, which the literature increasingly highlights: simulating a continuing bond may prolong the liminal state in which the deceased feels not fully gone. Grief is a process; transformation involves internalising the memory, not preserving external echoes. Researchers also warn of technological “hauntings” – the unsettling, unregulated presence of simulated voices that can disrupt mourning and induce emotional harm.

These AI systems often come with subscription models and algorithmic designs that reinforce emotional dependency, not healing. The bots optimise for engagement, risking the manipulation of grief for profit. Monetising grief, misrepresenting the dead, and blurring the lines between memory and fiction are very real risks. What happens, for instance, if your money runs out and you cannot continue the monthly subscription? Whilst Reflekta have indicated that there are use limitations and safeguards, they cannot guarantee against the risks of dependency. Such dependency may reduce the bereaved individual’s emotional autonomy. If removing the bot means losing a vital connection, users become bound to the technology rather than free to grieve naturally.

More worrying still are the risks of anthropomorphism and what is called the ELIZA effect. This is where users inadvertently project human traits onto the chatbot, conflating simulation with sentience, which can deepen confusion and emotional reliance. At its worst this can be quite damaging. The well-documented recent case of a man in New York who, after a divorce, developed an intense dependency on a bot and threatened to jump off a building is a case in point.

So far I have been very negative and cautionary about the use of AI in grief and bereavement support – but there are, I would suggest, some potential areas of use and benefit, if properly framed and boundaried.

AI might support archival storytelling – capturing memories, organising testimonials, preserving stories for future generations – but it must not aim to replace the dynamic relationship or the evolving self. Such memory archives are not chatbots; rather, they act as repositories of multimedia memories. There is also potential in legacy tools and in encouraging reflection – helping families to remember together, without ongoing simulated interaction. But all of this has to be framed within ethical design, and all practice, I would suggest, has to ensure transparency, autonomy, consent, and ideally an “off button.”

AI in bereavement occupies a fragile tension between preservation and potential disruption. It can store voices and stories – but it cannot embody relational depth. Communities like Good Life, Good Death, Good Grief and others remind us that grieving is not about simulation, but about shared humanity, ritual, and respect. AI can support healthy grieving, but only if held lightly, ethically, and sacrificially – subservient to the human, not the other way around. I am not at all convinced that the bots we have seen and the products currently on the market achieve that.

I leave you with the sage words of the great Mary Oliver, in “When Death Comes”:

When death comes
like the hungry bear in autumn;
when death comes and takes all the bright coins from his purse

to buy me, and snaps the purse shut;
when death comes
like the measle-pox

when death comes
like an iceberg between the shoulder blades,

I want to step through the door full of curiosity, wondering:
what is it going to be like, that cottage of darkness?

And therefore I look upon everything
as a brotherhood and a sisterhood,
and I look upon time as no more than an idea,
and I consider eternity as another possibility,

and I think of each life as a flower, as common
as a field daisy, and as singular,

and each name a comfortable music in the mouth,
tending, as all music does, toward silence,

and each body a lion of courage, and something
precious to the earth.

When it’s over, I want to say all my life
I was a bride married to amazement.
I was the bridegroom, taking the world into my arms.

When it’s over, I don’t want to wonder
if I have made of my life something particular, and real.

I don’t want to find myself sighing and frightened,
or full of argument.

I don’t want to end up simply having visited this world.

“When Death Comes,” via the Library of Congress.


Donald Macaskill

Photo by Steve Johnson on Unsplash