The patient is talking about relationship difficulties. The language is vague. You ask a direct question about safety at home. The AI is recording.
Every GP knows that some consultations are different. The patient who finally discloses domestic abuse after three appointments about “headaches.” The teenager who mentions self-harm for the first time. The parent whose story does not quite add up when you ask about the child’s bruise. These consultations require a particular kind of clinical skill — careful language, active listening, and precise judgement about what goes into the record and how.
AI scribes do not change the fundamentals of these consultations. But they do add a new layer of complexity. There is now a system listening to and transcribing every word in real time. And the decisions you have always made about clinical documentation — what to include, what to summarise, what to record elsewhere — now need to account for an intermediary that does not understand context, nuance, or the difference between what was said and what should be written down.
This lesson is about maintaining your clinical judgement and your patient’s trust in the consultations where it matters most.
Mental health disclosures
When a patient discloses suicidal ideation, self-harm, or a mental health crisis, the language they use is often tentative, fragmented, and deeply personal. They might say, “I’ve been having some dark thoughts,” or “Sometimes I wonder if everyone would be better off without me.” The clinical task is to explore this sensitively, assess risk, and document appropriately.
An AI scribe will attempt to transcribe this conversation verbatim or summarise it into clinical language. Both approaches carry risks. Verbatim transcription captures the patient’s raw words — words they may have chosen carefully, words that carry emotional weight, words they might not want preserved exactly as spoken in a permanent medical record. Clinical summarisation, on the other hand, may strip the nuance. “Patient reports passive suicidal ideation without plan or intent” is clinically accurate but does not capture the hesitation, the context, or the fact that it took the patient twenty minutes to say it.
The AI can also simply get the risk assessment wrong. If the patient says, “I wouldn’t actually do anything,” it might record “No active suicidal intent” — but your clinical assessment, based on tone, body language, and the broader context, might be more cautious than that. The AI captures words. You assess the person.
AI scribes capture spoken language. They do not assess clinical risk. A patient’s words and your clinical assessment of those words are two different things. The record should reflect your assessment, not just the transcript.
In these situations, consider whether you want the AI scribe running at all during the most sensitive parts of the conversation. You can pause the recording, conduct the sensitive discussion, and then write your clinical summary manually. This is not a workaround. It is good clinical practice.
Domestic violence and abuse
Domestic abuse consultations are among the most complex in general practice. The patient may be disclosing for the first time. They may be accompanied by the perpetrator in the waiting room. The language is often coded or indirect. And the clinical record has specific legal and safeguarding implications that go far beyond a routine consultation.
Consider what an AI scribe records when a patient says, “Things have been difficult at home. He gets angry sometimes. It’s not that bad.” An AI will transcribe or summarise those words. But the clinical task is much more complex. You need to assess safety, explore the nature and frequency of the abuse, consider the involvement of children, and decide what goes into the medical record — knowing that this record could be subject to a Subject Access Request, disclosed in legal proceedings, or read by anyone with access to the patient’s notes.
There are situations where the exact wording matters enormously. If a patient says, “He hit me last night,” and the AI summarises this as “Relationship difficulties — reports physical altercation,” the clinical and legal significance has been softened. Conversely, if the patient speaks indirectly and the AI interprets “He gets frustrated” as “Patient reports partner aggression,” it has added a clinical interpretation the patient did not make.
In domestic abuse cases, the medical record may be accessed through Subject Access Requests, court proceedings, or safeguarding processes. Every word matters. Do not rely on an AI scribe to get this right. Write these notes yourself.
The safest approach for domestic abuse disclosures is to pause the AI scribe and document the consultation yourself. Use your clinical system’s coding and free-text fields to record the disclosure accurately, using the patient’s own words where appropriate, with your clinical assessment clearly distinguished.
Safeguarding concerns
Safeguarding documentation has specific requirements that AI scribes are poorly equipped to meet. When you identify a safeguarding concern — whether for a child or a vulnerable adult — the medical record needs to capture not just what was said, but what you observed, what concerned you, and what actions you took. It often needs to be coded in specific ways, recorded in specific templates, and shared through specific channels.
An AI scribe might capture the content of the conversation reasonably well. But it will not know to use the correct safeguarding codes. It will not flag the consultation for multi-agency review. It will not distinguish between information that should be in the main record and information that should be in a safeguarding-specific template. And it certainly will not exercise professional judgement about what not to record — information that, if visible in the main record, could compromise the safety of the child or adult at risk.
There is a further risk. If safeguarding concerns emerge during a consultation that the AI is transcribing, the full transcript — including any indirect references, partial disclosures, or your exploratory questions — may be stored in the AI system’s processing pipeline. Depending on your practice’s data processing agreement with the AI vendor, this data may be stored, even temporarily, outside the secure clinical record. For safeguarding cases, this is an unacceptable risk.
Develop a habit: whenever a consultation takes a turn towards safeguarding, pause the AI scribe immediately. You can always summarise the earlier, non-sensitive part of the conversation from the AI note and add your safeguarding documentation manually.
This is not about being overly cautious. It is about recognising that safeguarding documentation has legal, ethical, and procedural requirements that no current AI scribe is designed to meet.
When and how to pause
You have always had the ability to control what goes into the medical record. AI scribes do not change that. But they do require you to be more deliberate about exercising that control, because the default has shifted. Without an AI scribe, nothing is recorded unless you write it. With an AI scribe, everything is recorded unless you pause it.
There are several situations where pausing the AI scribe is good practice: before asking screening questions about domestic abuse, self-harm, or substance use; when a patient’s disclosure shifts unexpectedly into sensitive territory; when the patient is visibly uncomfortable with the recording; when you are exploring a safeguarding concern; and when the conversation involves third parties who have not consented to recording.
The mechanics of pausing vary between systems: some have a physical button, some a software control, and some respond to a spoken command. Whatever the mechanism, you should be able to do it smoothly and without making the patient feel that something has gone wrong. A simple statement works well: “I’m going to pause the note-taking system for this part of our conversation, so we can talk more freely.”
Patients generally respond well to this. It signals that you are taking their disclosure seriously, that you are prioritising their comfort, and that you understand the sensitivity of what they are sharing. Far from undermining trust, the act of pausing can strengthen the therapeutic relationship.
After the sensitive part of the conversation, you have several options. You can resume the AI scribe for the remainder of the consultation. You can keep it paused and document the rest manually. Or you can use the AI note for the non-sensitive portions and add your own documentation for the sensitive elements. The right approach depends on the specific consultation.
What belongs in the record
The question of what to record in sensitive consultations is not new. GPs have always exercised judgement about this. The core principles remain the same: record clinical facts, your assessment, and your actions. Record enough to ensure continuity of care. Record what is necessary for the patient’s safety and for any legal or safeguarding obligations. Do not record unnecessary detail that could cause harm if disclosed.
What the AI scribe adds to this equation is the risk of over-documentation. An AI transcribing a sensitive consultation may capture every hesitation, every exploratory question, every tentative statement — details that you would never normally include in the clinical record. This over-documentation can be harmful. A verbatim transcript of a patient describing domestic abuse, for instance, could be deeply distressing if read back during legal proceedings or a Subject Access Request.
The principle is straightforward: the clinical record should contain your professional summary of the consultation, not a transcript of it. This has always been true, but it becomes particularly important when an AI system is generating that first draft for you.
For sensitive consultations, your manual documentation should include: what the patient disclosed (using their own key phrases where clinically important), your clinical assessment of risk, the actions you took or plan to take, any referrals made, any safety planning discussed, and any follow-up arranged. It should not include a verbatim account of the conversation, unnecessary personal details about third parties, or speculative notes that go beyond your clinical assessment.
You have always had the professional responsibility — and the professional right — to determine what goes into the medical record. AI scribes do not remove that responsibility. They make it more important that you exercise it deliberately.
Patient choice and control
Patients have the right to decline having an AI scribe active during their consultation. This is not a theoretical right — it should be a practical, accessible option. Your practice’s consent process should make this clear, and any patient who declines should receive exactly the same standard of care without any suggestion that they are being difficult or causing inconvenience.
Some patients will be happy for the AI scribe to run for most of the consultation but want it paused for specific parts. This is entirely reasonable. A patient might be comfortable with the AI recording their medication review but not their discussion about anxiety. Respecting these preferences is part of patient-centred care.
Be particularly attentive to non-verbal cues. A patient who agreed to AI recording at reception may become uncomfortable when the conversation turns personal. They may not feel confident enough to ask you to pause it. Watch for changes in body language, increased hesitation, or a shift to vague language — these may signal that the patient is self-censoring because of the recording.
Remember too that some patients may not fully understand what the AI scribe is doing. “The computer takes notes” does not convey the same information as “a system is recording and transcribing everything we say.” Informed consent requires genuine understanding, especially for consultations that may involve sensitive disclosures.
The goal is not to make patients anxious about the technology. It is to ensure they feel safe enough to tell you what you need to know to help them. If the presence of an AI scribe — or any technology — inhibits that, the technology should be paused. The consultation always comes first.
Key Takeaway
You have always had the power to pause, summarise, or omit sensitive details from the medical record when it protects the patient without compromising care. AI scribes do not remove that power or that responsibility. When in doubt, pause.