A survey of 1,000 GPs published in a BMJ journal has revealed that 20% of them use artificial intelligence (AI) tools in day-to-day clinical practice.
Some 29% of those who responded in detail use AI to generate documents after appointments, while 28% use it to suggest differential diagnoses.
The authors cautioned GPs against ‘subtle errors and biases’ and pointed out that these tools may risk harm to patients and compromise patient privacy, since it is not clear how the companies behind them use the data they receive.
Dr Ellie Mein, medico-legal adviser at the Medical Defence Union, told the Guardian that the use of AI by GPs could raise issues including inaccuracy and patient confidentiality.
“Along with the uses identified in the BMJ paper, we’ve found that some doctors are turning to AI programs to help draft complaint responses for them,” she said. “We have cautioned MDU members about the issues this raises, including inaccuracy and patient confidentiality. There are also data protection considerations.
“When dealing with patient complaints, AI drafted responses may sound plausible but can contain inaccuracies and reference incorrect guidelines which can be hard to spot when woven into very eloquent passages of text.”
GPs have begun to adopt clinical tools such as Heidi and OSLER by Tortus to support sessional work, as well as generic tools such as ChatGPT.
Dr Richard Fieldhouse, NASGP chair, said: “Revolutions in industry invariably have early adopters, as well as detractors, along with their respective benefits and side effects. So it’s prudent for the MDU to raise these concerns and advise caution. But now, it falls upon us, both as individuals and as a profession, to tread carefully – avoiding patient harm while also ensuring we harness the potential benefits that generative AI can offer.
“Generative AI could significantly improve continuity of care. For instance, confirming the doubling of Ms McKeown’s bisoprolol dosage, as noted in one of her 273 consultant letters, could be accomplished in seconds by AI, rather than the 10 minutes it might take a frazzled GP at the end of a long day.
“As GPs begin to adopt these tools, medical leadership and trainers must develop ways to educate the profession on their safe and ethical use.”