As AI technologies become increasingly applicable and available in healthcare settings, their implementation will need to be guided and supervised by human beings – for both technical and social reasons. This is the conclusion at the centre of a new study by the PCHSS’s Enrico Coiera and colleagues at the Australian Institute of Health Innovation.
Published in the Journal of the American Medical Informatics Association, “Envisioning an artificial intelligence documentation assistant for future primary care consultations: A co-design study with general practitioners” focuses on doctors’ perspectives on AI in primary care settings. Also known as digital scribes, AI documentation assistants are designed to automatically generate medical notes from patient-clinician conversations. They are intended to reduce the documentation burden on doctors, thereby helping prevent clinician burnout as well as potential loss of information.
The study was based on three co-design workshops carried out with 16 general practitioners (GPs). During the sessions, three main themes emerged: professional autonomy, human-AI collaboration, and new models of care. Regarding professional autonomy, GP participants raised concerns about the potential medico-legal issues caused by the retrospective assessment of full consultation records and also touched on the need to personalise AI systems to match doctors’ unique working styles. On this latter subject, one participant noted:
We need to develop machine learning that has the capability … to accommodate professional preferences and styles … It’s adaptive and it supports you based on your particular consultation and decision-making processes.
In relation to human-AI collaboration, the participants spoke of welcoming the assistance AI technologies could provide in terms of decision support and reducing repetitive tasks. However, they also raised concerns about constant auditing (assessment of the quality and performance of GPs’ work) and noted how difficult it would be for machines to provide the empathy and human communication that doctors offer. On this note, one participant stated:
I think the point of the doctor is to give suffering meaning. It’s to provide a steady hand, I think. It’s to support people through issues … a computer can support you, but it doesn’t have any meaning because there’s no emotional risk from the computer because the computer does not know what it means to live or die.
In connection with AI’s role in new models of care, the participants spoke of its potential assistance in the pre-consultation phase, in telehealth and through mobile apps, the latter of which can support the collection of patient-generated data and its seamless integration into electronic health records. On this point, one participant commented: “I quite like the idea of having an app that’s safe, that patients can update their health information that can link to the medical records.”
AI documentation assistants will no doubt become increasingly integrated into primary care consultations in the years ahead. In view of the wide-ranging reflections of the participating GPs, the study’s authors conclude that this process will require human input and supervision, especially until strong evidence emerges for reliable autonomous performance by AI technologies. Human-AI collaboration models will therefore need to be developed “to ensure patient safety, quality of care, doctor safety, and doctor autonomy.”