
The Stepford Scribes: A Suspicion of Gender Bias in Healthcare AI

Shier Ziser Dawood is a GPST2 trainee in North West London, currently undertaking an integrated training post on the use of AI in general practice.
As a GPST2 trainee, I recently began a novel integrated training post exploring AI use in general practice. It is the first of its kind in our programme, and with much excitement I tore open a new notebook, dusted off a pen and got to work learning about the exciting new world of medical AI.

Heidi, Annie, Tali, Suki, Lindy. These were the grammatically feminine names of electronic scribes being presented by a grid of entirely male CEOs or company representatives.

I began by watching a webinar comparing top AI medical scribes, products aimed at saving clinicians time when recording those all-important patient interactions. I was jotting down the names of companies and their scribes when a realisation came that somewhat dulled my initial excitement. Heidi, Annie, Tali, Suki, Lindy. These were the grammatically feminine names of electronic scribes being presented by a grid of entirely male CEOs or company representatives. Not all companies had ‘female’ scribes, but those that didn’t often had integrated assistants with similarly gendered monikers. It dawned on me that this brave new world of medical AI had more than a whiff of medical misogyny.

That evening I attended a live event on the future of technology in general practice. For a tech event, the gender distribution in the room seemed average: a split of roughly one-third female to two-thirds male has been quoted.¹ However, the panel discussion featured 100% male representation. When the floor opened to questions, I submitted: “In 2023, 60% of medical students in the UK were female, yet medical AI companies seem to be led almost exclusively by males. How can we encourage more female participation in the industry?”
My question received zero upvotes from attendees but was nonetheless put to the panel by the moderator, who believed it was a pertinent topic. I felt an awkwardness in the room as doctors shifted in their seats, perhaps looking around to identify who had asked this.
The discussion reflected (or possibly deflected) that the issue isn’t just women; many minorities are underrepresented in AI. The panel were divided on the magnitude of the issue, some stating that 50% of their workforce was female. A final response hinted at barriers preventing female clinicians from taking on the extra, often unpaid work that tech startups require, but stopped short of stating something like: “…it’s because they look after children.”

…AI bots could potentially reinforce outdated stereotypes of subservient female secretaries, aligning with patriarchal priorities to optimise healthcare for the male population.

This experience prompted me to consider the impacts of medical misogyny in the AI field. Research shows some algorithms and AI models aren’t as effective at diagnosing women and minorities,² arguably due to models being mostly designed by men and to inherent biases in training data.³ Even the naming of scribes gave me a deep sense of unease. Rather than advancing healthcare, these AI bots could potentially reinforce outdated stereotypes of subservient female secretaries, aligning with patriarchal priorities to optimise healthcare for the male population. The names of the electronic scribes conjure the ‘meme’ of an infamous novel (1972) and movie (1975, remade in 2004), The Stepford Wives, in which the men in the fictional town of Stepford replace their spouses with more subservient android copies.⁴

Of course, I wouldn’t suggest hiring women in medical AI purely based on gender. Nor would I suggest that we don’t embrace AI in medicine due to the risk of bias. But this particular elephant in the room needs addressing so that such biases can be mitigated.
Moreover, the discomfort I witnessed when raising this issue excites me about my project, in a different way than I initially envisioned. As a newcomer to AI-assisted healthcare, I am well positioned by this integrated training post to critically review the risks of AI potentially supercharging health inequalities. My relative lack of experience might be one of my biggest strengths. Especially in this era, where diversity, equity and inclusion might be unpopular in tech and beyond,⁵ I have no shareholders or social media followers to appease. This gives me a unique opportunity to highlight gender and minority inequalities in the field, perhaps unmasking some ‘Stepford scribes’ in the process.
References
  1. McKinsey Digital. Women in tech: The best bet to solve Europe’s talent shortage [Internet]. McKinsey & Company; [cited 2025 Feb 28]. Available from: https://www.mckinsey.com/capabilities/mckinsey-digital/our-insights/women-in-tech-the-best-bet-to-solve-europes-talent-shortage
  2. University College London. Gender bias revealed in AI tools screening liver disease [Internet]. UCL News; 2022 Jul 11 [cited 2025 Feb 28]. Available from: https://www.ucl.ac.uk/news/2022/jul/gender-bias-revealed-ai-tools-screening-liver-disease
  3. Korutz J, Kim DW, Zeba A, Shur LA, Logghe H, Demetres M, et al. Gender representation in academic radiology artificial intelligence research: a systematic review. J Digit Imaging. 2023 Dec;36(6):2090-2096.
  4. Wikipedia. The Stepford Wives [Internet]. [cited 2025]. Available from: https://en.wikipedia.org/wiki/The_Stepford_Wives
  5. Hatmaker T. OpenAI scrubs diversity commitment web page from its site [Internet]. TechCrunch; 2025 Feb 13 [cited 2025 Feb 28]. Available from: https://techcrunch.com/2025/02/13/openai-scrubs-diversity-commitment-web-page-from-its-site/

Featured Photo by Cash Macanaya on Unsplash
