By Ian Robertson, UK & Ireland Director, Tandem Health
The conversation around artificial intelligence in healthcare often swings between two extremes: utopian promises of transformation and dystopian warnings of obsolescence. But for those of us working on the ground, the reality is far more pragmatic. AI will not save the NHS, solve systemic underfunding or magically fix the workforce crisis. But it could, if deployed responsibly, save clinicians from drowning in the administrative tide that is pushing many of them to the brink.
Let me be clear. The NHS is not failing because its clinicians are underperforming. It is under strain because the system expects them to do more than is humanly sustainable. GPs and hospital doctors are not just seeing patients. They are secretaries, data entry clerks, auditors and coders. In fact, our “Time to Care” report found that nearly 40 percent of a GP’s day is spent on non-clinical tasks. That is not just inefficient; it is dangerous.
This is where AI, used wisely, can help. The Tandem medical scribe does not attempt to diagnose; it listens during consultations and generates clinical documentation in real time. It does not replace clinical reasoning. What it does is take a task that has become a major source of burnout and make it faster, easier and safer.
For all its promise, AI is not a silver bullet. If we see it as a fix-all, we’ll quickly run into frustration and unmet expectations. If we see it as a tool to address specific, well-defined problems, like documentation overload, it becomes genuinely powerful.
‘Transforming care.’ ‘Revolutionising delivery.’ These are the phrases we often hear when AI is pitched to the NHS. But without clarity on workflow, integration, safety and trust, they risk becoming empty promises. And above all, we must remember that healthcare is a human profession, not just a technical one.
We need to stop asking what AI can do in theory and start asking what it can do today. Can it write a coherent clinic letter that a patient understands? Can it reduce after-hours admin so a GP can get home on time? Can it make room for a conversation that might otherwise be rushed? If the answer is yes, then it is worth our attention.
Some worry that AI could erode clinical skills. In reality, the bigger challenge today is that highly trained professionals are spending hours on admin tasks like copying discharge codes. AI doesn’t take away clinical expertise – it protects it by giving time back to focus on patient care.
Concerns around hallucinations and data privacy are also valid, which is why safety and compliance must be baked in from the start. Tandem doesn’t store audio, we don’t train on patient data, and our models are clinically validated before deployment. These aren’t extras – they’re essentials.
Still, the real question isn’t just whether AI is safe, but whether we’re creating the conditions for it to succeed. For AI to be genuinely helpful, clinicians need more than just tools. They need time, trust, and technology that reduces friction rather than adds to it.
We also have to be pragmatic about scale. The NHS is a collection of systems and every trust has its own realities – from legacy tech to workflows that have evolved over decades. Rolling out new tools requires patience, funding and a deep understanding of clinical reality. It also requires us to listen to the clinicians who will use these tools, and to the patients who expect safe, empathetic care.
I don’t think AI is the future of medicine; people are. Doctors, nurses, carers. AI is just one of the tools that can give them more time, more headspace, and the support they need to care properly for their patients. So no, AI might not save the NHS on its own. But it could help protect something just as important: the time and space for a doctor to really listen, to think clearly, and to connect. And in a healthcare system under constant strain, that feels more valuable than ever.