A presentation at the Society for Pediatric Dermatology Annual Meeting discussed how generative artificial intelligence (AI) can streamline clinical and administrative processes in dermatology.
The Society for Pediatric Dermatology Annual Meeting, held in Toronto, Ontario, Canada, from July 11 to July 14, 2024, featured a session on the many ways that ChatGPT and other generative AI systems can help dermatologists run their practices. Albert C. Yan, MD, FAAP, FAAD, a pediatric dermatologist at the Children’s Hospital of Philadelphia, gave a presentation demonstrating how doctors can use generative pre-trained transformers (GPTs) to streamline processes, and offering words of caution and best practices for using AI in practice.
GPTs, Yan explained, are AI systems that generate content, including text, images, audio, and video, based on training with multimodal input. These include ChatGPT and Google’s Gemini, among others. The prompt bar is how users interact with these GPTs, and there is a method to getting what you want out of them.
“One of these [high-yield tips] is not just telling the AI—ChatGPT or Gemini—what you want, but also telling it what you don’t want,” Yan said. These generators can produce very long answers to general questions, so specifying exactly what you want helps ensure you get the information you need. For example, Yan said, if you want information about pustular eruptions, you can specify that you want common eruptions of the neonatal period. Asking the AI for a step-by-step process can also improve accuracy, and adding “be concise” to your prompt yields a shorter, more focused answer.
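As a rough illustration (not part of the presentation), the tips above can be collected into a reusable prompt template. The helper function, its parameter names, and the example inputs here are all hypothetical; a minimal sketch might look like this:

```python
def build_prompt(topic, include=None, exclude=None, step_by_step=False, concise=False):
    """Assemble a structured prompt following the tips above: state the
    topic, say what you do and don't want, and optionally ask for
    step-by-step reasoning and a concise answer."""
    parts = [f"Provide information about {topic}."]
    if include:  # tell the AI what you want
        parts.append("Focus on: " + "; ".join(include) + ".")
    if exclude:  # tell the AI what you don't want
        parts.append("Do not include: " + "; ".join(exclude) + ".")
    if step_by_step:  # asking for steps can improve accuracy
        parts.append("Work through your answer step by step.")
    if concise:  # shortens the response
        parts.append("Be concise.")
    return " ".join(parts)

# Hypothetical example mirroring the pustular-eruption scenario above
prompt = build_prompt(
    "pustular eruptions",
    include=["common eruptions of the neonatal period"],
    exclude=["rare genetic syndromes"],
    step_by_step=True,
    concise=True,
)
print(prompt)
```

The same structure applies whether the prompt is typed into ChatGPT, Gemini, or another tool; the point is simply that scope, exclusions, and format requests are stated explicitly rather than left to the model.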
In health care, Yan said, AI scribes are a new innovation available to doctors. Traditionally, a scribe is an assistant who helps the doctor document patient notes in the electronic health record; these new tools can simplify that process.
“There was this 1 study done…that looked at ambient scribes for clinicians,” Yan said. “And in essence, it saved the users about an hour to an hour and a half per day of scribe typing time with their electronic medical record. The more you use it, the better it helps you reduce your time in the charts.”
The study1 that monitored this use of AI found that most doctors rated the AI highly and fewer than 5% needed to edit its notes. Patients also felt that their doctor paid more attention to them when an AI scribe was used. Limitations include that the scribes currently support only English and Spanish, and AI hallucinations are also possible.
“AI scribes sometimes give you faulty information,” Yan noted. “For instance, in 1 example, a physician said that they were going to schedule a prostate exam but the scribe said it was done.”
Letters of medical necessity can also be drafted with generative AI models when needed, and the AI can provide references to help patients research their diagnosis. Yan said this can save doctors considerable time but is prone to major errors when it comes to references. He gave an example of asking a generative AI for references and receiving names of studies he had never seen, supposedly written by researchers in the field. When he double-checked their accuracy, however, he could not find the studies the AI had cited.
“Then I went back to the AI…[and] I said, ‘Are these real published articles or are they hallucinated?’ And then it comes clean,” he said while laughing with the audience. “It said, ‘I apologize for any confusion. I made these up.’”
Yan emphasized that generative AIs are usually good at fact-checking themselves, so asking whether the work is real should yield a concrete answer. Avoiding these hallucinations can be as simple as using detailed prompts and verifying the responses.
Other uses for generative AI include dropping PDFs into the tool to summarize articles or using it to simulate a patient for practice conversations. AI has also been used effectively to diagnose and monitor skin diseases. Yan noted that in other studies comparing AI with radiologists, when the 2 reached different conclusions but were both confident, the AI was usually right; when both were uncertain, the radiologist was usually right.
“It’s one of the things where we have to realize that sometimes humans are better at handling uncertainty than our AI systems, which are trained, as well, on uncertainty,” said Yan.
Yan concluded by reiterating some best practices for generative AI: use it to brainstorm diagnoses, provide information while maintaining patient privacy, fact-check responses, and stay updated on legal and institutional policies around AI tools. The big things to avoid, Yan said, are relying on ChatGPT to give you the best diagnosis, succumbing to anchoring bias with its responses, accepting all responses as medical fact, and entering patient information protected by HIPAA, as these generative AIs are not HIPAA compliant.
“So, can ChatGPT do my job? Not completely. But if you learn to use AI, it may help you do your job even better,” Yan concluded.
Reference