A patient walks in with a printed sheet — three pages of questions, prepared with ChatGPT. Is this a problem? Or the best-prepared patient you’ve seen all week?
Your patients are already using it
Around 40% of UK adults have used generative AI. A quarter of those have used it for health-related questions. That means roughly one in ten of the patients you see this week may have already consulted ChatGPT about their symptoms.
They’re not doing this because they don’t trust you. They’re doing it because you’re available for ten minutes and ChatGPT is available at two in the morning when the worry kicks in.
Some of what they’re doing is genuinely helpful: understanding medical terminology after receiving a hospital letter, preparing questions before an appointment, learning about a condition they’ve already been diagnosed with, translating complex medical language into something they can explain to their family.
A patient who walks in having researched their condition, who has thought about what they want to ask, who understands the basic terminology? That patient is going to get more out of their ten minutes with you. That’s not a threat. That’s an opportunity.
What clinicians are doing right now
These tools can save you significant time, as long as you use them for the right tasks. The current NHS position is that patient-identifiable information should not be entered into commercial AI tools like ChatGPT, Claude, or Gemini. We’ll cover the data protection rules in detail in Module Two. But the principle is simple: do not paste patient data into these tools. The opportunities below do not require you to.
Drafting practice protocols. Need a protocol for annual chronic kidney disease reviews? AI can produce a structured first draft based on NICE guidelines in three minutes. You verify the clinical detail, adjust it to your practice workflow, and circulate it to the team. What might have taken an hour to write from scratch takes fifteen minutes to review and refine.
Understanding published guidelines. NICE guidelines can be long and dense. You can ask AI to explain the key recommendations from a guideline, compare what two guidelines say about a topic, or outline the main points relevant to your practice. You’re asking the AI about publicly available information, not pasting guideline text into it. And you still verify the output against the original guideline.
Creating generic patient education materials. A leaflet about managing type 2 diabetes. An explanation of what a blood pressure reading means. Information about stopping smoking. As long as the content is generic rather than personalised to a specific patient, and you verify the clinical accuracy before using it, this is a valuable use of the technology.
Exploring clinical questions. What are the NICE recommendations for managing gout? What are the common side effects of dapagliflozin? How does the two-week-wait pathway work for suspected upper GI cancers? AI gives you a more structured answer than a search engine, but you still verify against proper sources like NICE, the BNF, or Clinical Knowledge Summaries.
None of these uses require you to trust the AI blindly. Every single one involves you reviewing the output. You remain the clinician. The AI is doing the first draft. You’re doing the quality control.
And the tasks that do involve patient data — summarising discharge letters, drafting referral letters, documenting consultations? Those are coming. NHS-approved tools with proper data protection agreements are being developed and piloted. Ambient scribes that record consultations within a secure framework are already in trial. But until your practice has approved a specific tool for use with patient data, stick to the tasks that don’t require it. There are plenty of them.
A personal example
I’m a GP. I’m not a software developer. I have no background in computer science.
And yet, using AI tools, I’ve built thirteen healthcare tools. Patient education platforms. Blood test explainers. Clinical calculators. A cardiovascular risk assessment tool. All designed for UK general practice. All using NHS-appropriate language and guidelines.
Three years ago, this would have been impossible for someone with my skill set. AI didn’t do this for me. I still had to understand the clinical content. I still had to design the user experience. I still had to test everything obsessively. But AI removed the technical barrier that would have stopped me from even trying.
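To give a flavour of what "a clinical calculator" means here, below is an illustrative sketch, not one of the thirteen tools described above: a few lines of Python that compute body mass index and assign the standard WHO adult category. This is the scale of building block that AI assistance makes accessible to a clinician with no programming background.

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: weight in kilograms divided by height in metres squared."""
    return weight_kg / height_m ** 2

def bmi_category(value: float) -> str:
    # Standard WHO adult thresholds. Note that NHS guidance uses lower
    # overweight/obesity cut-offs for people from some ethnic backgrounds,
    # which a real tool would need to handle.
    if value < 18.5:
        return "underweight"
    if value < 25:
        return "healthy weight"
    if value < 30:
        return "overweight"
    return "obese"

print(round(bmi(70, 1.75), 1))      # 22.9
print(bmi_category(bmi(70, 1.75)))  # healthy weight
```

The point is not the code itself but the workflow: the clinician supplies the clinical logic and the thresholds, checks them against the guidance, and lets the AI handle the syntax.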
Your clinical experience is not being replaced. It’s being amplified. You know things that no AI knows. You understand context that no algorithm captures. AI just gives you new ways to apply that knowledge.
What the NHS is exploring
At an organisational level, the NHS is cautiously exploring several AI applications:
Triage. AI systems that help prioritise patients based on symptom descriptions. Early trials have produced mixed results — the systems work reasonably well for straightforward presentations but struggle with nuanced, multi-problem patients.
Medical imaging. AI analysis of chest X-rays, retinal scans, and skin lesions. Probably the most mature area of healthcare AI — genuinely impressive for specific, well-defined tasks, but still a decision support tool, not a replacement.
Population health. Using AI to identify patterns in practice data — patients overdue for screening, cohorts at highest risk of admission, gaps in chronic disease management. Promising but depends heavily on data quality.
Administrative tasks. Automated coding, document processing, appointment management. This is where AI could have the biggest immediate impact on workload — not in the consultation room but in the back office.
All of these come with caveats: early stage, dependent on good data, requiring human oversight, not yet proven at scale in UK general practice. But the direction is clear.
The principle to hold onto
AI is useful when it handles the repetitive, time-consuming tasks that take you away from actual patient care. Drafting protocols. Understanding guidelines. Creating educational materials. Exploring clinical questions.
It is not useful as a replacement for clinical judgement, examination, or the human relationship at the heart of general practice.
The question isn’t whether AI will be part of healthcare. It already is. The question is whether we shape how it’s used — or whether we let it happen to us.
Key Takeaway
AI is most useful for non-patient-data tasks like drafting protocols, exploring clinical questions, and creating educational materials. NHS-approved tools for patient data are coming — but until then, stick to tasks that don’t involve identifiable information.