Albuquerque Journal

New AI tools helping doctors communicate with their patients

BY CARLA K. JOHNSON

Don’t be surprised if your doctors start writing you overly friendly messages. They could be getting some help from artificial intelligence.

New AI tools are helping doctors communicate with their patients, some by answering messages and others by taking notes during exams. It’s been 15 months since OpenAI released ChatGPT. Already thousands of doctors are using similar products based on large language models. One company says its tool works in 14 languages.

AI saves doctors time and prevents burnout, enthusiasts say. It also shakes up the doctor-patient relationship, raising questions of trust, transparency, privacy and the future of human connection.

A look at how new AI tools affect patients:

In recent years, medical devices with machine learning have been doing things like reading mammograms, diagnosing eye disease and detecting heart problems. What’s new is generative AI’s ability to respond to complex instructions by predicting language.

Your next check-up could be recorded by an AI-powered smartphone app that listens, documents and instantly organizes everything into a note you can read later. The tool also can mean more money for the doctor’s employer because it won’t forget details that legitimately could be billed to insurance.

Your doctor should ask for your consent before using the tool. You might also see some new wording in the forms you sign at the doctor’s office.

Other AI tools could be helping your doctor draft a message.

Doctors or nurses must approve the AI-generated messages before sending them.

Will AI make mistakes?

Large language models can misinterpret input or even fabricate inaccurate responses, an effect called hallucination. The new tools have internal guardrails to try to prevent inaccuracies from reaching patients — or landing in electronic health records.

“You don’t want those fake things entering the clinical notes,” said Dr. Alistair Erskine, who leads digital innovations for Georgia-based Emory Healthcare, where hundreds of doctors are using a product from Abridge to document patient visits.

The tool runs the doctor-patient conversation across several large language models and eliminates weird ideas, Erskine said. “It’s a way of engineering out hallucinations.”

As doctors review AI-generated notes, they can click on any word and listen to the specific segment of the patient’s visit to check accuracy.

In Buffalo, New York, a different AI tool misheard Dr. Lauren Bruckner when she told a teenage cancer patient it was a good thing she didn’t have an allergy to sulfa drugs. The AI-generated note said, “Allergies: Sulfa.”

The tool “totally misunderstood the conversation,” said Bruckner, chief medical information officer at Roswell Park Comprehensive Cancer Center. “That doesn’t happen often, but clearly that’s a problem.”

What about the human touch?

AI tools can be prompted to be friendly, empathetic and informative. But they can get carried away.

“At times, it’s an astounding help and at times it’s of no help at all,” said Dr. C.T. Lin, who leads technology innovations at Colorado-based UC Health, where about 250 doctors and staff use a Microsoft AI tool to write the first draft of messages to patients. The messages are delivered through Epic’s patient portal.

The tool had to be taught about a new RSV vaccine because it was drafting messages saying there was no such thing. But with routine advice — like rest, ice, compression and elevation for an ankle sprain — “it’s beautiful for that,” Lin said.

Also on the plus side, doctors using AI are no longer tied to their computers during medical appointments. They can make eye contact with their patients because the AI tool records the exam.

What about privacy?

U.S. law requires health care systems to get assurances from business associates that they will safeguard protected health information, and the companies could face investigation and fines from the Department of Health and Human Services if they mess up.

Doctors interviewed for this article said they feel confident in the data security of the new products and that the information will not be sold.

RYAN PRINS/ASSOCIATED PRESS
In this photo provided by the University of Michigan Health-West, Dr. Lance Owens, chief medical information officer at the university, demonstrates the use of an AI tool on a smartphone. The software listens to a doctor-patient conversation, then documents and organizes it to write a clinical note.
