Dutch hospital info chief as medical chatbot is rolled out: Let's not regulate AI to death

The University Medical Centre Groningen (UMCG) is using an AI chatbot to help answer the hundreds of questions it receives from patients each week, easing the workload of already overstretched healthcare providers.

In early December, EU policymakers reached a political agreement on the AI Act, which is set to become a global benchmark for regulating this increasingly popular technology. Shortly before, the UMCG hospital in the north of the Netherlands began using AI to help answer the hundreds of questions it receives from patients.

In an interview with Euractiv, Dr Tom van der Laan, Chief Medical Information Officer at UMCG, urged the EU not to over-regulate artificial intelligence so that the technology can help ease the region’s healthcare constraints.

“Let’s not regulate this to death. It might be our only chance to have some level of healthcare shortly for older people instead of being constrained and of a lower quality than what we’re used to,” van der Laan said.

“Not being able to use this technology is going to have graver consequences than using it and maybe exceeding the risk profile a little bit,” he added.

Dr Chatbot

The medical officer is spearheading UMCG’s use of AI; the hospital is the first in Europe to use the technology to draft answers to patient emails, which a healthcare professional then checks before they are sent.

Every week, the UMCG receives more than 1,200 written questions about various topics, including medication use and pain management, increasing the administrative burden on doctors and other healthcare professionals.

While van der Laan praised AI’s capabilities, he said that healthcare remains human work.

“Artificial intelligence can support and make work easier, but healthcare professionals are irreplaceable in healthcare for the time being,” he said.

Nonetheless, he said that AI would change the field of medicine and that it arrives at an opportune time, with populations ageing and fewer people with the skills to care for them.

More time with patients

According to van der Laan, when healthcare providers spend less time on administrative tasks, they have more time for patients. Their heavy workload also means they often have to keep email replies to patient questions brief.

The AI may be able to tweak the replies to sound more empathetic – it wished one patient a happy holiday at the end of an email, a touch a hurried physician might not have thought to add.

Van der Laan also uses AI to help with patient rounds and summarising his patients’ medication changes.

“It’s like asking a human language question to another physician. It will come up with an answer,” said van der Laan.

The AI application was developed by the US company Epic, which also supplies the hospital’s electronic patient file (EPD) software and has a strategic partnership with Microsoft.

A spokesperson for Epic told Euractiv that AI data processing for European customers occurs in a secure environment in Europe.

“Healthcare organisations in the EU and around the world use our AI tools to increase efficiency, make clinicians’ working lives easier and more enjoyable, and improve the patient experience,” they said.

The AI used during the trial was not a self-learning system, so the chatbot did not learn from patient data. It was integrated within the EPD to keep patient data secure and inaccessible to the supplier. It uses OpenAI’s GPT-4 model, but the hospital can switch at any time if, for example, a specialised medical model emerges in the future.
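
As an illustration only (the names below are hypothetical, not Epic’s or UMCG’s actual code), a minimal sketch of such a setup would keep the model name configurable and hold every draft for clinician review, assuming a GPT-4-style chat API is reachable from inside the secure environment:

# Illustrative sketch; draft_patient_reply and MODEL_NAME are hypothetical names.
from openai import OpenAI

client = OpenAI()      # assumes an OpenAI-compatible endpoint inside the secure environment
MODEL_NAME = "gpt-4"   # kept configurable so a specialised medical model could be swapped in later

def draft_patient_reply(question: str) -> str:
    """Draft an answer to a patient question; nothing is sent until a clinician approves it."""
    response = client.chat.completions.create(
        model=MODEL_NAME,
        messages=[
            {"role": "system", "content": "Draft a reply for a clinician to review before it is sent."},
            {"role": "user", "content": question},
        ],
    )
    return response.choices[0].message.content  # lands in the clinician's review queue, not the patient's inbox

Swapping models would then only require changing MODEL_NAME, leaving the human review step untouched.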

When in doubt, the AI refers patients to humans

In Sweden, an investigation was launched after a triage chatbot did not correctly prioritise one in five patients, as Euractiv reported.

In the Netherlands, van der Laan said that rather than giving inaccurate information, the UMCG’s model sometimes responds that it lacks the information needed to give an accurate answer, and in those cases it recommends that the patient contact a (human) healthcare professional.

“The main limitation is that we’re instructing it not to give medical advice for it not to cross the line and become a medical device,” he said.
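
Purely as an assumption about how such a constraint might be expressed (the hospital’s actual instructions are not public), a guardrail like this is typically a standing instruction passed to the model alongside every patient question:

# Hypothetical guardrail text; not the hospital's actual prompt.
GUARDRAIL_INSTRUCTION = (
    "Never give medical advice, a diagnosis, or a treatment recommendation. "
    "If the question cannot be answered accurately from the information provided, "
    "say that you lack the necessary information and recommend that the patient "
    "contact a healthcare professional."
)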

In the coming weeks, more Dutch hospitals that use an EPD from the same supplier will adopt the application, in collaboration with other hospitals in the Epic Dutch Association. According to van der Laan, experience with its use in various American hospitals has been very positive.

[By Christoph Schwaiger, Edited by Vasiliki Angouridi | Euractiv.com]
