Focus on AI in healthcare


Early forms of AI have been used clinically for some time – for example, ECGs that can spot conditions such as bradycardia and tachycardia (slow or fast heart rates) using well-established clinical rules.
As AI has evolved, enthusiasm for its possibilities has grown. But so too have the concerns, ranging from issues of control and ownership of data through to possible risks of replacing human interaction. We asked THIS Institute’s Professor of Data Science and Healthcare Improvement, Niels Peek, about the potential for AI in the NHS. We also talked to Marcus Lewis, a GP from North London, about the practicalities of some of the AI tools already being used.
The human touch
Niels’ research investigates how new technologies can best support health systems, including their effects on patient outcomes and experience. He begins by recognising that innovations are, by themselves, not solutions. “In the context of a highly complex system like the NHS, technical innovation is only the starting point,” he says. “To realise the benefits of innovation, we need to change structures, processes, and behaviour, which is often the greater challenge. Our research focuses on understanding how we can best make changes to achieve the benefits that the technology has to offer for patients and professionals.”
Niels believes that the key potential for AI in healthcare lies in reducing administrative and operational burden – and perhaps less in directly replacing human interaction or expertise. “Administrative and operational processes are usually well understood, well described, and highly structured. And computers are happy to carry them out!”
He explains that “Some AI tools can, for example, help with small but important processes like documenting consultations or collecting information about patients who approach their GP. If tasks like these can be carried out, even partially, by AI, that could potentially free up valuable time for NHS staff.”
But he cautions that “Clinical decision-making is the core of clinical professionals’ jobs, their area of expertise, and often the most interesting part of their work, so they don’t want computers to take it over any more than their patients do.” For example, commenting on his work on how online consultation systems can use AI to identify people who need urgent care, he says, “Typically, while the AI is set to flag potentially urgent or high-risk patients, NHS professionals then use their judgment and experience to decide whether the flagged patients do need urgent care.”
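As a rough sketch of that division of labour (the rule, keywords, and names below are invented for illustration and describe no specific NHS system), an AI triage step might only ever raise a flag, with the final decision always routed to a clinician:

```python
# Hypothetical sketch: AI flags potentially urgent requests; a clinician decides.
from dataclasses import dataclass

URGENT_KEYWORDS = {"chest pain", "breathless", "collapse"}  # illustrative only


@dataclass
class Request:
    patient_id: str
    symptoms: str


def ai_flag(request: Request) -> bool:
    """Return True if the request *might* be urgent.
    A flag is a prompt for human review, never a diagnosis."""
    text = request.symptoms.lower()
    return any(keyword in text for keyword in URGENT_KEYWORDS)


def triage(request: Request, clinician_review) -> str:
    if ai_flag(request):
        # Flagged requests go to a clinician, who applies judgment and
        # experience; the AI never makes the final call.
        return clinician_review(request)
    return "routine"


request = Request(patient_id="example-001", symptoms="Sudden chest pain since this morning")
print(triage(request, clinician_review=lambda r: "urgent"))  # clinician confirms urgency
```

The design choice is the point: the AI widens the net, and the human narrows it.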
Digital scribes for general practice
One of Niels’ current projects is looking at digital scribes – AI tools that record and then transcribe doctor-patient conversations. He sees the possibilities for AI to improve clinical documentation processes and to support collating and summarising clinical information about a patient, but is clear that careful, rigorous evaluation is needed. He’s particularly interested in whether digital scribes do in fact save GP time, and how any time saved gets used.
Marcus Lewis also sees the potential in digital scribes, while emphasising that they require careful clinical oversight.
“You can’t just blindly copy and paste [digital scribes’] output into patient records. They can occasionally ‘hallucinate’ examination findings or symptoms that were never discussed, and I’ve heard of colleagues finding completely fabricated diagnoses documented with confidence. Every word needs reviewing before it goes anywhere near the patient record.”
Marcus Lewis
“That said, when used properly, they do change how we consult in some interesting ways. You find yourself thinking aloud more, narrating your clinical reasoning for the AI, which patients often find reassuring. They can hear you working through their symptoms, making your communication clearer. The obvious benefit is more eye contact – instead of typing whilst talking, you can focus entirely on the person in front of you.”
But there’s an interesting trade-off in terms of documentation, according to Marcus. “The AI output tends to be verbose, over-formatted, and often repetitive. The well-recognised, idiosyncratic notes of a colleague – which you could scan quickly because you knew their style – become standardised AI prose that lacks individual voice.” He notes that “You lose the recognisability that helps you understand not just what happened, but how your colleague was thinking about the case. I might save five minutes creating notes, but will my colleague spend extra time tomorrow trying to extract the key clinical information from all that detailed, impersonal formatting?”
The future
Many forces are at play in the new technologies becoming available for, and used by, frontline NHS services. Commercial pressures, ‘technology push’ (developing new technology before there is a demand for it), and the attraction of the new can all lead to adoption, sometimes in a piecemeal way.
Niels stresses that all these innovations need to be well designed, with work-system design and infrastructure given proper priority, appropriate governance, and strong public and staff engagement. For instance, a common concern focuses on what happens to data collected during a digitally scribed consultation. Niels says:
“Many people are uncomfortable with the idea that what they say to a doctor confidentially might be shared with engineers who develop AI – but the law requires that, before this happens, data are anonymised to the point where an individual could never be identified from their data. Once people understand this, they are often reassured.”
Niels Peek
Involving NHS staff and patients in the development and implementation of AI tools is a consistent theme across Niels’ projects. He believes that information technologies, whether AI or not, are about supporting or changing information processes, and so any innovations need detailed understanding of those processes, gained from the people involved, to be successful. He also urges caution about predictive AI, because expectations around what can be predicted are often unrealistic and could lead to extra burden. “Every prediction tool will produce false positives, which is a burden both for the people involved and for the NHS.”
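A back-of-the-envelope calculation makes that burden concrete (the figures here are illustrative assumptions, not taken from Niels’ work): even a tool with 90% sensitivity and 95% specificity, applied to a condition affecting 1 in 100 patients, flags far more people who are well than people who are ill.

```python
# Illustrative arithmetic: even a good prediction tool generates many
# false positives when the condition it predicts is uncommon.
population = 100_000
prevalence = 0.01      # 1 in 100 patients actually has the condition
sensitivity = 0.90     # proportion of true cases the tool catches
specificity = 0.95     # proportion of non-cases the tool correctly clears

has_condition = population * prevalence                              # 1,000 patients
true_positives = has_condition * sensitivity                         # 900 correctly flagged
false_positives = (population - has_condition) * (1 - specificity)   # 4,950 wrongly flagged

ppv = true_positives / (true_positives + false_positives)
print(f"Flagged: {true_positives + false_positives:.0f}, "
      f"of whom only {ppv:.0%} actually have the condition")
# Output: Flagged: 5850, of whom only 15% actually have the condition
```

On these assumptions, roughly 85% of flagged patients are false positives, each needing follow-up – exactly the kind of extra burden Niels describes.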
Marcus adds that there’s a skill to getting the most from AI tools. “For instance, AI scribes only capture what’s said aloud, not the clinical hunches, subtle observations, or non-verbal cues that often guide our decisions,” he says. “Learning to articulate those insights to the scribe after the patient leaves is something we’re all having to figure out on our own. So, AI tools work ‘out of the box’ but mastering them requires a kind of training that doesn’t yet exist.”
It may be some time before the benefits of AI in healthcare are realised, and managing the risks may be an ongoing challenge. But if tasks like admin and documentation – the things computers are so good at – are to be handed over in ways that support clinicians in doing what they do best, talking to patients and making clinical decisions, good evidence to guide progress will be critical.