
Technology and values in general practice

9 July 2025

Ben Hoban is a GP in Exeter.

There is something familiar about the current discussion of Artificial Intelligence (AI) in general practice: one more change that we must decide whether to embrace, ignore, or resist.1 Should we view this new technology as a useful addition to our toolkit, or an existential threat to a distinctive way of working? Although it is easy to take sides on principle, depending on whether we see ourselves as more progressive or conservative, it is perhaps worth examining the wider context too.

Historically, technology has advanced according to a pattern in which artisans using hand tools are gradually replaced by workers operating machines. Workshops were thus overtaken long ago by factories able to produce goods more quickly and at lower unit cost; the mercury sphygmomanometer used by doctors for over a century is no longer in service, and anyone can measure their blood pressure by simply pressing a button on a plastic box; a computer was originally a person who carried out computations but is now just another unit on our desks. We might summarise these changes by observing that the essence of what is being done is transferred from a person to a machine.


This internalisation of function from people into machines is also associated with changes in the way systems work more generally and the values they adopt. An artisan demonstrates their skill by creating unique objects that reflect the needs of their client, and a limited rate of work merely increases the desirability of the product. A machine’s competence lies instead in the uniformity and volume of its output, so that as many people as possible get what they want, although they must only want what is available. As Henry Ford supposedly said of the Model T, “Any customer can have a car painted any color that he wants so long as it is black.”2

Few people would choose to return to an age in which artisans relied on the patronage of the nobility, scarcity kept prices high, and advanced manufactured goods were out of most people’s reach. The effect of technology on society is mixed, however. We can find huge amounts of information online, without necessarily being able to distinguish it from the latest misinformation; we can access all sorts of services through our phone, including healthcare, but only if we have the necessary ability and credit; and we can buy cheap products that we have been taught to want, dispose of, and replace in a wearying cycle that perhaps benefits those running the show more than either consumers or manufacturers. We have certainly advanced technologically in ways that address many of our human needs, but have we perhaps inadvertently assumed that all of them can be met in the same way?

We can see something similar in general practice, where the Quality and Outcomes Framework, the Additional Roles Reimbursement Scheme, and the Modern General Practice Model exemplify the constant drive to increase activity and standardisation. Ambient scribes, widely used AI applications that “listen in” to a consultation and produce a written summary, therefore represent the convergence of similar trends in technology and healthcare.3 We can outsource the writing of notes to a machine, freeing ourselves to focus on our patients, just as many of us have largely freed ourselves from the management of long-term conditions, medicines management, acute illness, and musculoskeletal and psychological problems by delegating them to other members of the primary care team with more defined and protocolised roles. The expansion of these non-medical roles in general practice may have bolstered our ability to “deliver appointments” and comply with various guidelines, but it has also increased pressure on GPs, whose workload has become more complex as a result, and who have become responsible for supervising these new colleagues as well as caring for patients.4 If ambient scribes can make us more efficient, experience suggests that we will simply end up seeing more patients.

We might also question the idea that writing notes in general practice is primarily about transcription, however well summarised. This may be what happens in secondary care, where doctors legitimately “take a history,” but they can only do this because the underlying form and content of that history, or story, have previously been agreed by the patient and their GP. Writing up a consultation, or indeed a referral to secondary care, involves multiple decisions to include, exclude, frame, order, and emphasise different narrative elements, based on an underlying set of values and assumptions.5,6 It is a process which is necessarily collaborative and creative, not merely recording facts, but making sense of them. As such, it is inherently artisanal: could a machine write such a story without missing its point?

Despite their differences, though, technology and narrative have something in common. Both represent highly effective ways of engaging with the world around us, the one physically and the other conceptually: just as machines internalise function, so stories internalise meaning. It would be difficult to build a modern car by hand; explaining why a man walks into a bar would rather defeat the purpose of the telling. Machines and stories allow us to compress what we do and mean, not just to carry out low-level activities more efficiently, but to act at a higher level than we otherwise could. We rightly feel the tension between interpersonal care tailored to the needs of individual patients and more generic services, often provided at arm’s length and across a whole population, and this broadly reflects the differences between the workshop and the factory. AI seems most at home in the factory, but can the workshop benefit from it too without losing its artisanal character?


Technology is not the only thing that has advanced. As we increasingly recognise the beautiful complexity of the body and the mind, it becomes easier to see the shortcomings of the biomedical model in helping us understand and care for whole people. Where general practice was once the poor relation of hospital medicine, we have now grown into the role of expert generalists, not just applying biomedicine to diseases, but engaging over time with the narrative of people’s lives and helping them navigate the uncertainties of illness. This higher level of practice is neither every patient’s nor every doctor’s experience, however.7 Should we lower our expectations, then, or use the tools we have to keep them high?

I understand very little about AI. I secretly worry that the machines we build to save us will one day kill us in our sleep, without knowing whether this is entirely irrational, or whether their environmental impact is in fact a greater threat. I am intrigued, though, that these applications are built on what are formally called Large Language Models, because language and stories are the substrate of general practice, and I hope that there is room for this new technology to bridge the gap between the workshop and the factory rather than widening it. The real question is not whether AI can make GPs more efficient, but whether it can uphold the artisanal values of general practice at a time when efficiency seems to be all that matters.

Deputy Editor’s note – see also: https://bjgplife.com/imagining-the-future/

References

  1. Aldhelaan N, McNamara P. Artificial intelligence in primary care: a plausible prospect or distant delusion? Br J Gen Pract 2025;75(suppl 1):bjgp25X741645. doi:10.3399/bjgp25X741645.
  2. Ford H, Crowther S. My Life and Work. Doubleday, Page and Company; 1922.
  3. Stokel-Walker C. The “ambient scribe” tools listening to and summarising your doctor-patient consultations. BMJ 2025;389:r663. doi:10.1136/bmj.r663.
  4. Armitage R. Not helpful but harmful. Br J Gen Pract 2024;74(745):367. doi:10.3399/bjgp24X739005.
  5. Senior T. Stories and medical records. Br J Gen Pract 2025;75(751):85. doi:10.3399/bjgp25X740721.
  6. Launer J. Narrative-Based Primary Care: A Practical Guide. CRC Press; 1996.
  7. Payne R, Dakin F, MacIver E, Swann N, Pring T, Clarke A, Kalin A, Moore L, Ladds E, Wherton J, Rybczynska-Bunt S, Husain L, Hemmings N, Wieringa S, Greenhalgh T. Challenges to quality in contemporary, hybrid general practice: a multi-site longitudinal case study. Br J Gen Pract 2024;75(750):e1-e11. doi:10.3399/BJGP.2024.0184.

Featured Photo by Alex Knight on Unsplash
