Module 4: Making AI Work in Your Practice
Lesson 7 of 7 · ~8 min read

Sustaining and Scaling

From pilot project to practice standard


The pilot worked. The rollout is done. The team is using the tool. Now what? This final lesson is about the long game: keeping AI working well in your practice for years, not just weeks, and sharing what you have learned with others.

Most guides on AI implementation end at the rollout. They assume that once the tool is deployed, the job is done. In reality, that is when the real work begins.

Technology changes. Staff change. Clinical guidance changes. The AI tool that works perfectly today will need updating, re-evaluating, and potentially replacing. Your practice needs structures that sustain good AI use over time, not just for the duration of a project.

Building it into your governance

AI should not be a standalone project. It should be embedded in your existing clinical governance framework.

Add AI to your clinical governance agenda. Every monthly or quarterly governance meeting should include a brief AI update. Review the measurement dashboard from Lesson 5. Discuss any incidents or near-misses. Share learning. This takes five minutes and keeps AI on the radar.

Include AI in significant event reviews. When a significant event involves AI documentation — whether the AI caused the problem or was a bystander — include the AI dimension in your analysis. Was the note reviewed? Was the AI output accurate? Did the review process catch the issue?

Update your policies annually. Your AI documentation policy, your DPIA, your patient information — review them at least once a year. Check that they still reflect your current practice. Update them when the tool is updated or when national guidance changes.

Assign the annual AI governance review to a specific month. January works well, as it coincides with many practices' annual policy review cycle. Put it in the calendar now and it will not be forgotten.

Handling staff changes

People join and leave practices. Each change is a moment where AI competence can slip.

New starters. Add AI documentation training to your induction checklist. Every new clinician should complete the same training your existing team received: tool mechanics, note review skills, consent process, and governance awareness. Do not assume that because they used AI at their previous practice, they know your practice’s approach.

Locums. This is a particular challenge. Locums may not have used your AI tool before. They may have their own preferred tools. Your practice needs a clear policy: locums use the practice’s approved tool or document manually. Brief them at the start of their session. Provide a one-page quick-start guide.

Registrars. GP registrars are often more comfortable with AI than their trainers. That is fine — but comfort is not the same as competence. Ensure registrars understand the review process, the governance framework, and the professional responsibility implications. Their training log should include AI documentation competence.

Leavers. When a clinician leaves, review their recent AI-assisted notes as part of the handover. Ensure any outstanding corrections or follow-ups are completed.

Staying current with the technology

AI tools evolve rapidly. The tool you adopt today will be updated multiple times over the next year. New features will appear. Existing features will change. Some updates will improve things; others will introduce new challenges.

Supplier updates. Stay in touch with your AI tool supplier. Read their update notes. Test new features before rolling them out to the whole team. If a major update changes how the tool works, consider whether retraining is needed.

AVT Registry changes. NHS England updates the AVT Registry periodically. New tools are added. Requirements evolve. Check annually that your tool is still listed and that no new conditions have been attached to its approval.

National guidance. The BMA, RCGP, GMC, NHS England, and ICBs are all producing guidance on AI in primary care. Much of it is still emerging. Designate someone in your practice to monitor key publications and flag anything that affects your AI use.

You do not need to be on the cutting edge. You need to be current. There is a difference. Being current means using approved tools according to current guidance. Being on the cutting edge means experimenting with unproven technology. For clinical use, current is safer than cutting edge.

Scaling beyond documentation

Once your practice has successfully embedded AI documentation, you may want to explore other AI applications. This is natural and healthy — but apply the same disciplined approach.

AI-assisted triage. Some practices are exploring AI tools that help prioritise patient contacts. These carry higher clinical risk than documentation tools and require more rigorous evaluation.

Population health analytics. AI can help identify patterns in your practice population — patients at risk of hospital admission, groups overdue for screening, prescribing trends. These tools are lower risk but need careful data governance.

Clinical decision support. AI tools that suggest diagnoses, recommend investigations, or flag drug interactions are moving closer to the medical device boundary. Evaluate these against even stricter criteria than documentation tools.

For each new AI application, go back to basics: Is it on the AVT Registry or equivalent? Has a DPIA been completed? Has the team been trained? Is it being measured? The framework you have built for documentation tools applies to every AI tool your practice will ever use.

Sharing what you have learned

One of the most valuable things you can do is share your implementation experience with other practices.

Your PCN. Practices within your Primary Care Network are likely at different stages of AI adoption. Your experience — the things that worked, the mistakes you made, the governance templates you created — can save them significant time and effort. Offer to present at a PCN meeting.

Your ICB. Integrated Care Boards are actively looking for examples of good AI implementation in primary care. Your measurement data, your training materials, and your governance documentation may be useful at system level.

Your professional networks. GP forums, social media groups, LMC events, RCGP local faculties — these are all places where your experience can help colleagues who are just starting out.

The practices that implement AI well tend to be the practices that communicate well. The skills you have developed through this implementation — planning, governance, training, measurement — are leadership skills. They are transferable to any change project your practice undertakes.

Looking ahead to Module 5

In this module, we have focused on making AI documentation work in your practice. We started with why you need a plan, built the plan step by step, trained the team, integrated the tool into workflows, measured the impact, troubleshot the problems, and set up structures for long-term sustainability.

Module 5 will take us beyond the consultation room entirely. We will look at how AI can help with the operational side of general practice — workflow automation, population health management, correspondence handling, and quality improvement. These are the tools that do not just save clinicians time but help the whole practice work more effectively.

But that is for next time. For now, take stock of where you are. If you have worked through this module thoughtfully, you have the knowledge and the framework to lead AI implementation in your practice. That is a significant professional achievement.

AI in general practice is not a destination. It is a journey that every practice will navigate at its own pace. What matters is that you navigate it deliberately, safely, and with your patients’ interests at the centre. That is what good doctors have always done, with every new tool and every new challenge.

Key Takeaway

Sustaining AI in your practice means embedding it in your governance, preparing for staff changes, staying current with technology updates and national guidance, and sharing your experience with others. The framework you have built for AI documentation — Decide, Prepare, Pilot, Embed — applies to every AI tool your practice will adopt. AI implementation is not a one-off project; it is a new area of ongoing clinical leadership.

Reflect on Your Learning

These questions are designed for your CPD appraisal portfolio. Use them to reflect on what you have learned in this module and how it applies to your practice. You can copy or screenshot your answers as evidence of self-certified CPD.

  1. Does your practice have a written AI policy? What would it need to include?
  2. Who are the Enthusiasts, Pragmatists, Cautious, and Resistant in your team? How would you approach training each group?
  3. What three metrics would you track to measure the impact of AI scribing in your practice?

Approximate CPD time for Module 4: 2 hours (including listening, reading, and reflection).