ChatGPT, a chatbot built on a large language model, developed by OpenAI and backed by Microsoft, has been found to provide inaccurate cancer treatment recommendations, according to a recent study. Researchers at Boston's Brigham and Women's Hospital asked ChatGPT to produce treatment plans for breast, prostate, and lung cancer cases of varying severity. While 98% of ChatGPT's responses included at least one recommendation that aligned with the National Comprehensive Cancer Network (NCCN) guidelines, only 62% agreed with the guidelines in full. In addition, in 12.5% of cases ChatGPT offered recommendations with no support in the guidelines at all.
The study highlighted that ChatGPT tended to blend incorrect recommendations with correct ones, making errors difficult to detect, even for experts. The findings raise concerns about the reliability and trustworthiness of AI-based healthcare tools. As more healthcare providers and patients turn to such chatbots for guidance, ensuring that their recommendations are accurate and evidence-based becomes crucial. The authors call for further improvement and validation of AI models in healthcare to protect patient safety and the overall quality of care.
This research underscores the need for continued evaluation and refinement of AI technologies in medicine. While AI has shown considerable promise across many domains, including healthcare, these findings emphasize the importance of rigorous validation against established guidelines such as the NCCN's before such tools are trusted in clinical practice. Collaboration among AI developers, medical experts, and regulatory bodies will be essential to address these limitations so that AI-powered chatbots can reliably support clinicians and provide sound recommendations for patients' treatment plans.