Considering ChatGPT for Medical Research? Proceed with Caution, DBMI's Brad Malin Says

Interest in ChatGPT, the artificial intelligence-powered chatbot and text generation tool, is growing among physicians, with nearly 43 percent saying they would consider using it for medical research, according to a recent poll.

G-Med, an online crowdsourcing platform for physicians, conducted the poll of 424 respondents. It found that 42.69 percent said they would consider using ChatGPT for medical research, 32 percent said they would not, and roughly 25 percent were unsure.

"The poll was conducted in the context of a growing concern about the use of large language models in scientific writing, and the need for transparency and integrity in research methods," G-med wrote in a blog post. 

Transparency, integrity and ethics are at the heart of an ongoing debate across the scientific community over using the tool in research.

Bradley Malin, PhD, a medical ethicist at Vanderbilt University Medical Center in Nashville, Tenn., who also leads an ethics group for the National Institutes of Health's Bridge2AI program, told Becker's that what is important for physicians to keep in mind about ChatGPT for now is that it is a new tool that can be useful, but there is still much to learn about its accuracy and reasoning.

"It should definitely be used with caution at the moment, in that it may assist and speed things up for the investigator who uses it to search with, but they're still going to have to go off and verify to see if the information has been provided to them is correct," Dr. Malin said. "That's the challenging part. I don't think anybody is ready to just completely trust any of the information that a privately owned AI system is going to provide to the end user at this time."

Read more in Becker's Hospital Review!