Tuesday, 02 January 2024 12:17 GMT

Study Reveals AI Chatbots Pose Risks When Used for Medical Advice


(MENAFN) Relying on artificial intelligence (AI) chatbots for medical guidance can be “dangerous,” according to a recent study published in the journal Nature Medicine, media outlets reported on Tuesday.

The investigation, conducted by the Oxford Internet Institute and the Nuffield Department of Primary Care Health Sciences at the University of Oxford, found that depending on AI for medical decisions carries risks because of the chatbots’ “tendency to provide inaccurate and inconsistent information.”

Rebecca Payne, a co-author of the study and a practicing general practitioner, stated: “Despite all the hype, AI just isn’t ready to take on the role of the physician.”

She emphasized that “patients need to be aware that asking a large language model about their symptoms can be dangerous, giving wrong diagnoses and failing to recognize when urgent help is needed.”

In the study, nearly 1,300 participants were asked to identify potential health conditions and suggest next steps across a series of scenarios. Some used large language model platforms to reach a diagnosis, while others relied on conventional methods such as consulting a GP.

The study found that AI tools frequently produced a “mix of good and bad information,” which participants found difficult to differentiate.

Although the chatbots “excel at standardized tests of medical knowledge,” the researchers concluded that their practical application as medical assistants “would pose risks to real users seeking help with their own medical symptoms.”
