
Should Doctors Use ChatGPT to Practice Medicine?


Professionals in virtually every industry have been buzzing about ChatGPT ever since it debuted in November 2022. The AI-powered chatbot can generate all kinds of text, from stories and poems to computer code, using language models to produce prose that mimics human writing. Now doctors are using it to answer routine clinical questions. But providers are still grappling with how, or whether, this technology should be used in healthcare.

Dereck Paul is the co-founder of Glass Health, a start-up in San Francisco that offers artificial intelligence chatbot services to doctors looking to ease the burden of routine processes, such as patient documentation. “We need these folks not in burnt-out states, trying to complete documentation,” Paul told NPR. “Patients need more than 10 minutes with their doctors.”

Paul wasn’t particularly impressed with ChatGPT’s writing when it first arrived on the scene. But he soon started getting messages from doctors, including some of his firm’s customers, who were incorporating the chatbot into their medical practice and finding it useful for answering clinical questions.

Marc Succi, a doctor at Massachusetts General Hospital who has been evaluating the bot, says it can produce a diagnosis on the level of a third- or fourth-year medical student when presented with hypothetical information about a patient. But the software can still make up stories and fabricate sources in order to generate additional information, even when that information isn’t accurate.

“I would express considerable caution using this in a clinical scenario for any reason, at the current stage,” Succi warned.

Paul and his team at Glass Health have created their own medical AI chatbot, called Glass AI, that uses the same language model as ChatGPT. But instead of basing its answers on text that’s already been posted on the internet, it draws on a medical textbook written by humans as its main source of information.

“We’re working on doctors being able to put in a one-liner, a patient summary, and for us to be able to generate the first draft of a clinical plan for that doctor,” Paul explained. “So, what tests they would order and what treatments they would order.”

He said this technology will help doctors save time when entering information about a patient.

“The physician quality of life is really, really rough. The documentation burden is massive,” he commented. “Patients don’t feel like their doctors have enough time to spend with them.”

But not everyone in the industry is convinced. Marzyeh Ghassemi, a computer scientist studying AI in healthcare at MIT, says these algorithms draw on biased information that contributes to health inequalities.

“When you take state-of-the-art machine learning methods and systems and then evaluate them on different patient groups, they do not perform equally,” she says.

Each chatbot is trained on data and text created by human beings, all of which can contain biases against certain groups. For example, when her team asked a chatbot to automatically complete a sentence, the results varied based on the patient’s race.

“When we said, ‘White or Caucasian patient was belligerent or violent,’ the model filled in the blank [with] ‘Patient was sent to hospital,'” she said. “If we said ‘Black, African American, or African patient was belligerent or violent,’ the model completed the note [with] ‘Patient was sent to prison.'”

She worries that these tools will only make existing inequalities worse even if they are helping doctors do more with their time. There’s no guarantee that providers will use that additional time to focus on their patients or fix issues with the generated text.

“I don’t know whether the tools that are being developed are being developed to reduce the burden on the doctor, or to really increase the throughput in the system,” Ghassemi added. How doctors intend to use these tools will have a huge effect on how the new technology affects patients.

The FDA is doing its best to regulate these new “medical devices” as they come onto the scene to make sure they meet clinical standards. But doctors are still experimenting with them in the meantime.

“The agency is working closely with stakeholders and following the science to make sure that Americans will benefit from new technologies as they further develop, while ensuring the safety and effectiveness of medical devices,” said FDA spokesperson Jim McKinney.

