This extended blog is based on an opening address to the Northern Ireland Social Work and Social Care Research in Practice Conference, given in Belfast on the 11th of March.
There is a moment, standing close to a piece of old analogue equipment, when you hear not just the operation of the device but something subtler: a faint hum, a resonance. Engineers used to say that if you listened carefully enough, you could hear the “soul in the circuit.”
Today, in a world saturated with digital systems, artificial intelligence, and learning machines, we risk believing that everything valuable must be generated by computing power or captured in data. And yet – in the quiet spaces between calculation and connection – I believe there is something profoundly human still asking to be heard.
This talk is about that something.
It is about how we care, how we age, and how we hold one another in times of vulnerability. And increasingly, how we do all of this in a world shaped by circuits, algorithms, prediction, and artificial intelligence.
I want to start our journey in a place that may seem very far from social care – the world of global retail. Some time ago, I spent a day with organisations that live at the cutting edge of prediction, personalisation, and customer behaviour, from supermarkets to global online platforms. They spoke about the evolution of digital experience: how they have moved from simply observing consumer behaviour to predicting it.
For over a decade, retail has been obsessed with personalisation. But we are moving beyond that. We are entering the era of agentic AI – systems that act autonomously on our behalf.
In the world of online shopping, AI now searches, compares, validates, builds baskets, and completes purchases – sometimes without the human ever asking directly.
These are systems acting on our behalf, in our interests, anticipating our needs before we articulate them.
The consumer is becoming the passenger rather than the active operator.
This is not science fiction; it is commercial reality.
As foreign as that might sound to the world of social care, the truth is this: the individuals we support are consumers in that world. Their expectations have been shaped by it. The way technology anticipates their preferences, curates their choices, tailors their experience – all of this is already forming the baseline from which they enter our care homes, our home care services, our community supports.
But here is the essential difference: in retail, anticipating preference is about convenience. In care, anticipating need is about dignity, identity, risk, autonomy, and life itself.
So when I began reflecting on these two worlds – the frictionless, hyper‑personalised world of retail and the intimate, relational, unpredictable world of care – I realised that the comparison is not superficial. It is revealing. Because increasingly, both environments are shaped by the same forces. AI doesn’t stay in one sector; it moves with the human person.
And that brings me to the central question: As AI moves deeper into the world of care, how do we ensure it enhances the human soul, rather than replacing or diminishing it? How do we ensure there is always soul in the circuit?
Let me take you further into what I heard that day. The leaders of these companies explained that consumers no longer want endless choice. They want meaningful relevance. Not a flood of information, but a flow of respect. Not bombardment, but anticipation. Not intrusion, but understanding. And they described how their AI systems build subtle psychological profiles from micro‑signals and timing rhythms, all in service of delivering something that feels personalised, responsive, almost relational.
So retail is shifting toward anticipatory systems: AI that predicts need before need is spoken.
When I heard that, I found myself thinking about the individuals in our care homes living with dementia, or people supported in their own homes managing layers of frailty, chronic illness, trauma, or loneliness. Imagine – what could anticipatory systems contribute? Devices that learn sleep disturbance before it spirals into crisis. Sensors that understand gait patterns and predict falls. Tools that identify subtle changes in mood, speech, mobility, or appetite that even the most attentive human might miss.
In a society like Scotland, where ageing demographics and workforce pressures collide, anticipatory care is not just desirable; it is essential. It has the potential to reduce crises, prevent unnecessary hospitalisation, and enable people to stay rooted in their homes, their communities, and their identities for longer.
But at what point does anticipation become paternalism? At what point does support become surveillance? At what point does helpfulness become a quiet erosion of autonomy?
That tension sits right at the centre of this talk.
In retail, there is a new term: the omniconsumer.
Retail is not only evolving personalisation; it is dissolving the idea of channels altogether. You are not a user of a platform – you are the platform. Everything flows around you. You become the pivot, the context, the centre point.
In social care we are – quite frankly – nowhere near that level of integration. Instead, we ask older citizens to navigate mazes of disconnected systems, assessments, criteria, budgets, professionals, and agencies. We ask people living with dementia to retell their story every few weeks to strangers. We ask families to do detective work simply to find out what support exists. And we act surprised when those living with complexity feel lost.
Yet the possibility exists – not for a consumerised model of care, but for an ecology of support shaped around the human being, not around institutions or service silos. A model where AI becomes the connective tissue holding information together, reducing friction, amplifying the contribution of the workforce, enabling continuity, and strengthening human connection.
Retail’s omniconsumer is simply a commercial metaphor for what we in care have known for generations:
A person’s life is not a set of disconnected episodes – it is a whole, living, relational story.
And AI allows us, perhaps for the first time, to coordinate care in a way that honours that wholeness.
But with this immense capability comes risk.
For older adults – especially those living with cognitive impairment – agentic AI could begin making decisions for them rather than with them.
We could see the erosion of autonomy, opaque decision‑making, surveillance without consent, systems that infantilise rather than empower, and the loss of the right to “be” rather than to be managed.
And perhaps the greatest risk: that overstretched services begin to rely on AI as a substitute for human presence.
This is where the soul can be lost in the circuit.
None of these positives is inevitable. Realising them requires ethical architecture. And for me, that ethical and human rights architecture has a name: PANEL – Participation, Accountability, Non‑discrimination, Empowerment, and Legality. PANEL is not a checklist; it is a philosophy of practice. It is how we keep soul in the circuit.
Participation means the older person, the individual receiving care, must shape the technology, not simply be shaped by it. Their stories, fears, rhythms, habits, hopes – these must be the raw material from which AI is designed. People with dementia, with learning disabilities, with mental health conditions know what helps, what harms, what preserves dignity. They must be co‑creators.
Accountability means there must always be a human answerable for what AI does. AI cannot become an unchallengeable oracle. Decisions must be explainable. Systems must be transparent. No person receiving care should feel powerless in the face of automated judgement.
Non‑discrimination matters deeply because AI inherits the bias of the data it is trained on. And since most datasets are drawn from younger, healthier, more digitally active adults, older people risk being misclassified, overlooked, or misunderstood. If we do not build inclusively, AI will magnify inequality.
Empowerment is perhaps the most profound. Technology must expand a person’s independence, not diminish it. It must help someone to live the life they want, not the life a system predicts for them. It must create capability, not dependency; confidence, not compliance.
And Legality is the foundation: rights must not lag behind innovation. The frameworks that uphold privacy, consent, autonomy and protection must evolve as quickly as the systems we deploy.
These principles are the spine of ethical care in an age of AI. They are the way we keep the compass steady. They are the way we ensure dignity does not become an afterthought. They are the way we weave humanity into technology, rather than allow technology to erase humanity.
And they matter because the workforce is exhausted. We have a crisis not of compassion but of capacity. People are leaving roles they love because they cannot sustain the burden. They are drowning in documentation, in regulation, in the administrative architecture of care. We ask them to be sherpa and shield, guide and guardian, healer and administrator.
AI enters this space not because we wish to technologise care but because we wish to preserve it. Used well, AI can remove burden, free time, streamline tasks, amplify decision‑making, and allow carers to spend their time doing the one thing no device can ever replicate: being human. Holding a hand. Listening to fear. Sitting in silence. Bearing witness to a life.
That is not automation; that is liberation.
But we must also face another truth. Loneliness is epidemic. For many older adults, companion technologies – conversational agents, memory‑prompting devices, social robots – are arriving as lifelines. We should not be afraid of this. Companionship technologies can stimulate connection, reduce anxiety, support sleep, and provide reassurance in long nights. But they must never become a substitute for human contact. They must be bridges, not walls. They must lead back to community, not away from it.
And this brings me to the heart of the matter.
The measure of success in AI in care is not efficiency. It is human connection.
If AI deepens relationship, strengthens autonomy, widens possibility, and restores dignity, then it is ethical. If it replaces presence, narrows choice, or fragments relationship, then it is not.
We stand at a moral threshold. Not simply a technological one. The question before us is: Who do we want to become? A society that manages ageing? Or a society that honours it? A sector that uses AI to compensate for scarcity? Or a sector that uses AI to create abundance? A world where human contact is optional? Or a world where technology frees us to be more present than ever?
The truth is simple: technology will be woven into the fabric of future care, into homes, behaviours, decisions, routines, conversations. But whether it diminishes or enhances the human experience is entirely up to us.
The soul is not in the machine.
The soul is in the intention.
The soul is in the relationship.
The soul is in the choice to design with dignity, to govern with humility, to innovate with compassion.
If we hold to PANEL, if we centre autonomy, if we prioritise presence, if we protect story, then we can weave the soul into every circuit. We can build a future defined not by what technology replaces, but by what it restores. Not by data, but by meaning. Not by scarcity, but by relationship.
And that, I believe, is our task. To ensure that as AI accelerates, humanity deepens. To ensure that as systems become more intelligent, we become more compassionate. To ensure that in every algorithm, every interface, every automated moment, the dignity of the human person remains paramount.
We have an opportunity – rare, profound, urgent – to build a future in which technology amplifies what makes us human, rather than eroding it. A future in which ageing is held with tenderness, decision‑making is honoured, vulnerability is protected, and care is celebrated as the heart of community.
Let us choose that future. Let us shape it together. Let us weave the soul into the circuit.
Thank you.