Evaluating accuracy and reproducibility of ChatGPT responses to patient-based questions in Ophthalmology: An observational study.
- Source :
- Medicine [Medicine (Baltimore)] 2024 Aug 09; Vol. 103 (32), pp. e39120.
- Publication Year :
- 2024
Abstract
- Chat Generative Pre-Trained Transformer (ChatGPT) is an online large language model that appears to be a popular source of health information, as it can provide patients with answers in the form of human-like text, although the accuracy and safety of its responses have not been established. This study aims to evaluate the accuracy and reproducibility of ChatGPT responses to patient-based questions in ophthalmology. We collected 150 questions from the "Ask an ophthalmologist" page of the American Academy of Ophthalmology, which were reviewed and refined by two ophthalmologists for eligibility. Each question was inputted into ChatGPT twice using the "new chat" option. The grading scale included the following: (1) comprehensive, (2) correct but inadequate, (3) some correct and some incorrect, and (4) completely incorrect. In total, 117 questions were inputted into ChatGPT, which provided "comprehensive" responses to 70/117 (59.8%) of questions. Reproducibility was defined as no difference in grading categories (1 and 2 vs 3 and 4) between the two responses to each question. ChatGPT provided reproducible responses to 91.5% of questions. This study shows moderate accuracy and reproducibility of ChatGPT responses to patients' questions in ophthalmology. After further modifications, ChatGPT may serve as a supplementary health information source, to be used as an adjunct to, but not a substitute for, medical advice. The reliability of ChatGPT should undergo further investigation.
- Competing Interests: The authors have no funding and conflicts of interest to disclose.
- (Copyright © 2024 the Author(s). Published by Wolters Kluwer Health, Inc.)
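The abstract's reproducibility metric (two responses counted as agreeing when both fall on the same side of the 1-2 vs 3-4 grading split) can be sketched as a small computation. This is an illustrative reconstruction only; the grade pairs below are made up, not the study data, and the function names are assumptions.

```python
# Sketch of the grading arithmetic described in the abstract.
# Grades: 1 = comprehensive, 2 = correct but inadequate,
#         3 = some correct and some incorrect, 4 = completely incorrect.

def is_accurate(grade: int) -> bool:
    """Grades 1-2 count as accurate; grades 3-4 as inaccurate."""
    return grade in (1, 2)

def reproducible(grade_a: int, grade_b: int) -> bool:
    """Two responses agree if both fall on the same side of the 1-2 / 3-4 split."""
    return is_accurate(grade_a) == is_accurate(grade_b)

# Hypothetical paired grades for each question's two independent ChatGPT responses.
pairs = [(1, 1), (2, 1), (3, 3), (1, 4), (2, 2)]

# Share of first responses graded "comprehensive" (grade 1).
comprehensive_rate = sum(1 for a, _ in pairs if a == 1) / len(pairs)
# Share of questions whose two responses land in the same grading category group.
reproducibility_rate = sum(reproducible(a, b) for a, b in pairs) / len(pairs)

print(f"comprehensive: {comprehensive_rate:.1%}")
print(f"reproducible: {reproducibility_rate:.1%}")
```

On real data, the same two ratios would be computed over all 117 question pairs, yielding the 59.8% and 91.5% figures reported above.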
- Subjects :
- Humans
Reproducibility of Results
Surveys and Questionnaires
Internet
Ophthalmology
Details
- Language :
- English
- ISSN :
- 1536-5964
- Volume :
- 103
- Issue :
- 32
- Database :
- MEDLINE
- Journal :
- Medicine
- Publication Type :
- Academic Journal
- Accession number :
- 39121263
- Full Text :
- https://doi.org/10.1097/MD.0000000000039120