Module 3: AI in the Consultation
Lesson 2 of 8 · 7 min read

Is This Tool Safe?

Five questions before trusting any AI tool


A colleague at the PCN meeting mentions a new AI tool their practice is trialling. It saves them twenty minutes per surgery. They ask if you have tried it. You have not. Should you?

Peer recommendation is powerful. When a colleague you trust tells you something works, you take it seriously. That is how good ideas spread in general practice — through word of mouth, PCN meetings, and WhatsApp groups.

But AI tools are not the same as a new consulting technique or a useful website. They process patient data. They record clinical conversations. They interact with your clinical system. The stakes are different. And a colleague’s enthusiasm, however genuine, is not the same as a proper evaluation.

I want to give you five questions. Ask them about any AI tool before you use it in clinical practice. If you can answer all five, you are in a strong position. If you cannot, the tool is not ready — regardless of how impressive it looks.

Question 1: Is it on the NHS England AVT Registry?

NHS England maintains an Assured Validated Technology (AVT) Registry for AI-enabled ambient scribing products. This is the single most important thing to check.

As of January 2026, there are 19 approved suppliers on the registry. These are companies whose products have been assessed against a defined set of clinical safety, data protection, and interoperability standards. They have undergone a formal evaluation process. They have signed data processing agreements with NHS England. Their products have been tested in NHS settings.

Being on the registry does not mean the tool is perfect. It does not mean it will never make errors. But it means the product has met a baseline standard for use in NHS clinical settings. It is the difference between a medication that has been through MHRA approval and one someone is selling on the internet.

How to check: Go to the NHS England website and search for “AI ambient scribing assured suppliers.” The list is publicly available. If the tool your colleague recommended is on the list, that is a good start. If it is not, that is a significant concern. Tools like Tortus, Heidi Health, Accurx Scribe, and Dragon Medical Copilot are examples of products that have been through this process — but always verify against the current list, as it is updated regularly.

Question 2: Is it MHRA registered?

This question is more nuanced than it sounds, and it trips up a lot of people.

The Medicines and Healthcare products Regulatory Agency (MHRA) regulates medical devices in the UK. Whether an AI scribe needs MHRA registration depends on what it claims to do.

A tool that transcribes — converting speech to text — is generally not classified as a medical device. It is a documentation tool, like a digital dictation system.

A tool that summarises, interprets, or suggests clinical actions based on the consultation content moves closer to the medical device boundary. If it generates a differential diagnosis, suggests a safety-netting action, or recommends a clinical code, it may need to be registered as a medical device.

The distinction matters because medical device registration brings additional regulatory oversight, post-market surveillance, and reporting requirements.

Many AI scribing tools do both transcription and summarisation. The regulatory classification depends on the specific claims the manufacturer makes. If a tool claims to assist with clinical decision-making, it should be MHRA registered. If you are not sure, ask the supplier directly and check the AVT Registry documentation.

Question 3: Where does the data go?

This is the question you should already be instinctively asking, based on everything we covered in Module 2.

When the AI tool records a consultation, the audio data has to go somewhere for processing. You need to know three things:

Where is it processed? The data should be processed on UK-based servers. Ideally within NHS infrastructure or on UK sovereign cloud services. If the audio is being sent to servers in the United States, or anywhere outside the UK, that is a governance concern — particularly for identifiable patient data captured during a clinical consultation.

Is there a Data Processing Agreement (DPA)? A DPA is a legal document that specifies what the processor can and cannot do with the data. NHS England’s AVT Registry requires suppliers to have a DPA in place. If a tool does not have one, or if the supplier cannot produce it when asked, do not use the tool for patient data.

How long is the data retained? The audio recording and the transcript should be processed, used to generate the note, and then deleted within a defined timeframe. It should not be stored indefinitely. It should not be used for training the AI model. And it should not be accessible to the supplier’s employees beyond what is necessary for technical support.

These are not unreasonable questions. Any reputable supplier will have clear answers. If the answers are vague, evasive, or missing, that tells you something important about the tool.

Question 4: Has your practice done a DPIA?

A Data Protection Impact Assessment (DPIA) is a formal process for evaluating the privacy risks of a new technology before you deploy it. Under UK GDPR, it is required when processing is likely to result in a high risk to individuals’ rights and freedoms. Recording clinical consultations with AI meets that threshold.

A DPIA is not a form you fill in once and file away. It is a structured assessment that considers what data is being collected, why it is being collected, who has access to it, what the risks are, and how those risks are being managed.

In practice, your Caldicott Guardian or information governance lead should be involved in this. If your practice is using an AI tool that processes patient data and no DPIA has been completed, that is a gap that needs addressing — not eventually, but now.

If your practice is using a tool from the NHS England AVT Registry, much of the groundwork for the DPIA will already have been done at a national level. But you still need a local DPIA that covers your specific practice context: your patient population, your clinical workflows, your data sharing arrangements, and your staff training. A national assurance does not replace local accountability.

Question 5: What happens when it goes wrong?

This is the question most people forget to ask. And it is arguably the most important.

Every technology fails. Software crashes. Networks go down. AI models produce errors. The question is not whether something will go wrong, but what happens when it does.

Three things to check:

Error pathways. If the AI produces an inaccurate note, what is the process for identifying and correcting the error? Is there an audit trail that shows what the AI generated versus what the clinician signed off? Can you see the original transcript alongside the structured note?

Clinical responsibility. The supplier will tell you, correctly, that the clinician is responsible for reviewing and approving the note. But what support does the tool provide for that review? Does it highlight areas of uncertainty? Does it flag when it could not hear something clearly? Or does it present a polished note that makes it easy to click “accept” without thinking?

Audit trails. If a concern is raised about a clinical record — a complaint, a significant event, a medicolegal enquiry — can you reconstruct what happened? Can you show what the AI generated, what you changed, and what you approved? Without an audit trail, you have no evidence of your review process.

If a tool makes it easy to approve notes without reviewing them, that is a design problem — not a feature. A responsible AI documentation tool should make review easy and thorough, not fast and superficial. If the workflow encourages clicking “accept” on notes you have not properly read, you are accumulating risk in your clinical record.

Putting it all together

Five questions. They take less than thirty minutes to answer for any given tool, and they protect you, your patients, and your practice.

1. Is it on the NHS England AVT Registry?
2. Is it MHRA registered (if it needs to be)?
3. Where does the data go?
4. Has your practice done a DPIA?
5. What happens when it goes wrong?

If you can answer all five satisfactorily, you have a tool that has been through proper assurance, handles data responsibly, and has clear processes for when things do not go to plan. You can use it with confidence.

If you cannot answer all five, the tool is not ready for your clinical work. It does not matter how enthusiastic your colleague is. It does not matter how impressive the demo looked. Governance is not bureaucracy. It is patient safety.

In the next lesson, we are going to look at what actually happens inside one of these tools — the journey from spoken word to structured note. Because understanding how ambient scribing works is essential to understanding its limitations.

Key Takeaway

Five questions protect you and your patients. If you cannot answer all five — AVT Registry status, MHRA registration, data processing location, DPIA completion, and error pathways — the tool is not ready for your clinical work, regardless of how impressive the demo looked or how enthusiastic your colleague is.