Unregulated AI scribes raise fresh privacy, security and consent issues for older Australians
A personal story that should make us all stop and think
When AI expert Kobi Leins booked a specialist appointment for her child, she was confronted with a condition she found unacceptable: the clinic required the use of an AI-transcription “scribe” during the consultation. She declined. The practice replied that the doctor would proceed only if the AI tool was used — and offered her the option of finding another practitioner. The appointment was cancelled.
That story may sound like an isolated clash of personalities, but it illustrates a growing trend: older Australians — already navigating complex health issues, multiple medications and chronic conditions — may find themselves directed to healthcare providers who insist on using AI tools. Without meaningful choice, this raises serious questions about consent, data security and the doctor-patient relationship. (Reported by Tom Williams of Information Age)
What are “AI scribes” and what’s changing?
“AI scribes” refer to software systems that listen to parts of a medical consultation (or record it) and then do one or more of the following:
- transcribe the discussion into text
- summarise findings and update the patient’s medical record
- in some cases, analyse the data and propose tasks, next steps or clinical suggestions.
In Australia about one in four general practitioners (GPs) are reported to be using AI or transcription tools of this kind.
While the promise is attractive — less paperwork for the doctor, more “face-to-face” time with you — the reality is more complex.
Why this matters for older Australians
- More to lose
If you’re over 60, you may already have multiple conditions, tests, specialists and medications. Your records are detailed, and you rely on their accuracy. An AI tool that makes a mistake or mis-transcribes something could have cascading effects.
- Consent and choice
Older patients may feel less comfortable questioning a medical practice, or simply assume the doctor decides everything. But you do have rights: you should be fully informed about what is being recorded, how your data is stored, and whether you can refuse the tool and still be seen. The regulator, the Australian Health Practitioner Regulation Agency (AHPRA), notes that patients must give informed consent if recordings or AI tools are used.
- Data privacy and cross-border risk
Some AI scribes may be operated by offshore vendors, store data overseas or update functionality without your knowledge. If you’re older and less confident with digital matters, you may not know to ask or check. The Therapeutic Goods Administration (TGA) warns that if a tool analyses or interprets the clinical conversation (rather than simply transcribing it), it may qualify as a “medical device” and must meet stricter regulation.
- Bias and under-representation of older people
Research shows older adults are often under-represented in the datasets used to train medical AI tools — meaning the AI may not serve your age group as well as it serves younger people.
The regulatory patchwork — and gaps
On the one hand, regulators are moving. The TGA has clarified that some AI scribes must be registered if they perform analysis or treatment-recommendation tasks. The federal Department of Health is reviewing “safe and responsible AI in healthcare” and has published a report highlighting gaps in how current legislation addresses new AI uses.
On the other hand, there remains considerable ambiguity:
- Many scribes are being used as “transcription tools” and fall outside strict device regulation.
- Medical practices are free to choose which AI tools they adopt — and some may mandate use.
- Older patients may feel they have to say “yes” or risk being referred elsewhere.
For example, the peak body Royal Australian College of General Practitioners (RACGP) has issued guidance saying clinicians must retain oversight, check AI outputs and ensure the patient understands when AI is involved.
Key questions older patients should ask before agreeing
When you make your next appointment, or if you’re told your doctor uses an AI tool, consider asking:
- What exactly will the AI do during our consultation? Will it simply transcribe, or will it make suggestions or analyses?
- Where will my data go? Is it stored in Australia? Who has access? Can I opt out and still be seen if I decline?
- Who is ultimately responsible for the accuracy of the notes and decisions made? Will the doctor review everything the AI produces?
- Has the tool been reviewed and approved (or registered) as a medical device? Is the practice accountable for any errors caused by the tool?
- What happens if the AI gets it wrong — or mis-represents something we discussed?
What you can do right now
- Before your appointment, ask if the practice uses AI transcription or scribe software.
- If you’re uncomfortable, tell the practice you prefer traditional note-taking and ask whether they can accommodate you.
- Bring someone with you — a partner, friend or advocate — especially if you rely on support in medical discussions.
- Keep your own records. Whether AI is used or not, writing down your symptoms, medications, questions and relevant background helps ensure accuracy.
- If you suspect privacy or data-security shortcomings (e.g., unclear consent, data stored offshore, access by third parties), you can raise concerns with your practitioner, the practice itself, or a regulator such as the Office of the Australian Information Commissioner (OAIC) for privacy matters.
AI in healthcare holds promise: fewer delays, better records, and more time with your doctor rather than paperwork. It can also help your doctor research and check things in real time — medications, relevant studies, side effects and so on. But for older Australians especially, the introduction of AI scribes should not come at the cost of choice, clarity or safety. Many won’t mind, but some will, and it helps to be aware of the issues.
As Kobi Leins’ experience shows, when a practice says you must use the AI tool or go elsewhere, that raises a red flag. Your health decisions, your data and your life experience deserve transparency. Don’t feel pressured — you have the right to ask, to understand, and to choose.
In your next appointment: pause at the question, “Will an AI scribe be used?” Make sure you get the clear answer you deserve — and that your consultation remains your discussion, not the machine’s.