
Appropriateness of Frequently Asked Patient Questions Following Total Hip Arthroplasty From ChatGPT Compared to Arthroplasty-Trained Nurses.

Authors :
Dubin JA; Bains SS; DeRogatis MJ; Moore MC; Hameed D; Mont MA; Nace J; Delanois RE
Source :
The Journal of arthroplasty [J Arthroplasty] 2024 Sep; Vol. 39 (9S1), pp. S306-S311. Date of Electronic Publication: 2024 Apr 16.
Publication Year :
2024

Abstract

Background: ChatGPT (Generative Pre-trained Transformer), a natural language artificial intelligence model, has gained unparalleled attention, accumulating over 100 million users within months of its launch. As such, we aimed to compare the following: 1) orthopaedic surgeons' evaluations of the appropriateness of answers to the most frequently asked patient questions after total hip arthroplasty; and 2) patients' evaluations of ChatGPT and arthroplasty-trained nurses' responses to their postoperative questions.

Methods: We prospectively created 60 questions addressing the most commonly asked patient questions following total hip arthroplasty. For each question, we obtained answers from arthroplasty-trained nurses and from ChatGPT-3.5. Surgeons graded each set of responses based on clinical judgment as 1) "appropriate," 2) "inappropriate" if the response contained inappropriate information, or 3) "unreliable" if the responses provided inconsistent content. Each patient was given a randomly selected question from the 60 aforementioned questions, with responses provided by ChatGPT and arthroplasty-trained nurses, using a Research Electronic Data Capture (REDCap) survey hosted at our local hospital.

Results: The 3 fellowship-trained surgeons graded 56 of 60 (93.3%) responses from the arthroplasty-trained nurses and 57 of 60 (95.0%) from ChatGPT as "appropriate." There were 175 of 252 (69.4%) patients who were more comfortable following the ChatGPT responses and 77 of 252 (30.6%) who preferred the arthroplasty-trained nurses' responses. However, 199 of 252 patients (79.0%) responded that they were "uncertain" about trusting artificial intelligence to answer their postoperative questions.

Conclusions: ChatGPT provided appropriate answers from a physician perspective, and patients were more comfortable with the ChatGPT responses than with those from arthroplasty-trained nurses. Ultimately, successful implementation depends on the ability to provide credible information that is consistent with the goals of physician and patient alike.

(Copyright © 2024 Elsevier Inc. All rights reserved.)

Details

Language :
English
ISSN :
1532-8406
Volume :
39
Issue :
9S1
Database :
MEDLINE
Journal :
The Journal of arthroplasty
Publication Type :
Academic Journal
Accession number :
38626863
Full Text :
https://doi.org/10.1016/j.arth.2024.04.020