Comparing the Ability of Google and ChatGPT to Accurately Respond to Oculoplastics-Related Patient Questions and Generate Customized Oculoplastics Patient Education Materials.

Publication, Journal Article
Cohen, SA; Yadlapalli, N; Tijerina, JD; Alabiad, CR; Chang, JR; Kinde, B; Mahoney, NR; Roelofs, KA; Woodward, JA; Kossler, AL
Published in: Clin Ophthalmol
2024

PURPOSE: To compare the accuracy and readability of responses to oculoplastics patient questions provided by Google and ChatGPT, and to assess the ability of ChatGPT to create customized patient education materials.

METHODS: We executed a Google search to identify the 3 most frequently asked patient questions (FAQs) related to each of 10 oculoplastics conditions. FAQs were entered into both the Google search engine and the ChatGPT tool, and responses were recorded. Responses were graded for readability using five validated readability indices and for accuracy by six oculoplastic surgeons. ChatGPT was then instructed to create patient education materials at various reading levels for 8 oculoplastics procedures, and the accuracy and readability of the generated procedural explanations were assessed.

RESULTS: ChatGPT responses to patient FAQs were written at a significantly higher average grade level than Google responses (grade 15.6 vs 10.0, p < 0.001). ChatGPT responses (93% accuracy) were significantly more accurate (p < 0.001) than Google responses (78% accuracy) and were preferred by expert panelists (79%). ChatGPT accurately explained oculoplastics procedures at an above-average reading level. When instructed to rewrite patient education materials at a lower reading level, ChatGPT reduced the grade level by approximately 4 grades (15.7 vs 11.7, p < 0.001) without sacrificing accuracy.

CONCLUSION: ChatGPT has the potential to provide patients with accurate information regarding their oculoplastics conditions. ChatGPT may also be utilized by oculoplastic surgeons as an accurate tool to provide customizable patient education for patients with varying health literacy. A better understanding of oculoplastics conditions and procedures amongst patients can lead to informed eye care decisions.

Published In

Clin Ophthalmol

DOI

10.2147/OPTH.S480222
ISSN

1177-5467

Publication Date

2024

Volume

18

Start / End Page

2647 / 2655

Location

New Zealand

Related Subject Headings

  • 3212 Ophthalmology and optometry
  • 1113 Ophthalmology and Optometry

Citation

Cohen, S. A., Yadlapalli, N., Tijerina, J. D., Alabiad, C. R., Chang, J. R., Kinde, B., … Kossler, A. L. (2024). Comparing the Ability of Google and ChatGPT to Accurately Respond to Oculoplastics-Related Patient Questions and Generate Customized Oculoplastics Patient Education Materials. Clin Ophthalmol, 18, 2647–2655. https://doi.org/10.2147/OPTH.S480222
Cohen, Samuel A., Nikhita Yadlapalli, Jonathan D. Tijerina, Chrisfouad R. Alabiad, Jessica R. Chang, Benyam Kinde, Nicholas R. Mahoney, Kelsey A. Roelofs, Julie A. Woodward, and Andrea L. Kossler. “Comparing the Ability of Google and ChatGPT to Accurately Respond to Oculoplastics-Related Patient Questions and Generate Customized Oculoplastics Patient Education Materials.” Clin Ophthalmol 18 (2024): 2647–55. https://doi.org/10.2147/OPTH.S480222.
Cohen SA, Yadlapalli N, Tijerina JD, Alabiad CR, Chang JR, Kinde B, Mahoney NR, Roelofs KA, Woodward JA, Kossler AL. Comparing the Ability of Google and ChatGPT to Accurately Respond to Oculoplastics-Related Patient Questions and Generate Customized Oculoplastics Patient Education Materials. Clin Ophthalmol. 2024;18:2647–2655.