GPT-4 as a Source of Patient Information for Anterior Cervical Discectomy and Fusion: A Comparative Analysis Against Google Web Search
Author
Mastrokostas, Paul G.
Mastrokostas, Leonidas E.
Emara, Ahmed K.
Wellington, Ian J.
Ginalis, Elizabeth
Houten, John K.
Khalsa, Amrit S.
Saleh, Ahmed
Razi, Afshin E.
Ng, Mitchell K.
Keyword
GPT-4
anterior cervical discectomy and fusion
artificial intelligence
health literacy
patient education
readability
Journal title
Global Spine Journal
Date Published
2024-03-21
Abstract
Study design: Comparative study.
Objectives: This study aims to compare Google and GPT-4 in terms of (1) question types, (2) response readability, (3) source quality, and (4) numerical response accuracy for the top 10 most frequently asked questions (FAQs) about anterior cervical discectomy and fusion (ACDF).
Methods: "Anterior cervical discectomy and fusion" was searched on Google and GPT-4 on December 18, 2023. The top 10 FAQs were classified according to the Rothwell system. Source quality was evaluated using the JAMA benchmark criteria, and readability was assessed using the Flesch Reading Ease and Flesch-Kincaid grade level. Differences in JAMA scores, Flesch-Kincaid grade level, Flesch Reading Ease, and word count between platforms were analyzed using Student's t-tests. Statistical significance was set at the .05 level.
Results: Frequently asked questions from Google were varied, while GPT-4 focused on technical details and indications/management. GPT-4 showed a higher Flesch-Kincaid grade level (12.96 vs 9.28, P = .003), a lower Flesch Reading Ease score (37.07 vs 54.85, P = .005), and higher JAMA scores for source quality (3.333 vs 1.800, P = .016). Numerically, 6 out of 10 responses varied between platforms, with GPT-4 providing broader recovery timelines for ACDF.
Conclusions: This study demonstrates GPT-4's ability to elevate patient education by providing high-quality, diverse information tailored to those with advanced literacy levels. As AI technology evolves, refining these tools for accuracy and user-friendliness remains crucial to cater to patients' varying literacy levels and information needs in spine surgery.
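The readability comparison above rests on two standard formulas and a two-sample comparison. As a minimal sketch (not the authors' analysis pipeline), the Python snippet below computes the Flesch Reading Ease and Flesch-Kincaid grade level from their published formulas using a crude vowel-group syllable counter, then applies Student's t-test to two lists of per-response grade levels; the lists and all variable names are hypothetical placeholders, not data from the study.

```python
# Minimal sketch: readability metrics and a two-sample t-test,
# assuming plain-text responses and a heuristic syllable count.
import re
from scipy import stats

def count_syllables(word: str) -> int:
    # Crude heuristic: count vowel groups; validated tools use dictionaries.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def readability(text: str) -> tuple[float, float]:
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    n_words = max(1, len(words))
    syllables = sum(count_syllables(w) for w in words)
    wps = n_words / sentences    # words per sentence
    spw = syllables / n_words    # syllables per word
    # Standard Flesch formulas.
    flesch_reading_ease = 206.835 - 1.015 * wps - 84.6 * spw
    fk_grade_level = 0.39 * wps + 11.8 * spw - 15.59
    return flesch_reading_ease, fk_grade_level

# Hypothetical per-response Flesch-Kincaid grade levels for 10 FAQs per platform.
google_fkgl = [9.1, 8.7, 10.2, 9.5, 8.9, 9.8, 9.0, 9.6, 8.8, 9.2]
gpt4_fkgl   = [12.5, 13.1, 12.8, 13.4, 12.9, 13.0, 12.6, 13.2, 12.7, 13.4]

t_stat, p_value = stats.ttest_ind(google_fkgl, gpt4_fkgl)
print(f"t = {t_stat:.2f}, P = {p_value:.4f}")
```

Because the syllable counter here is only a heuristic, scores from this sketch will differ slightly from those produced by dictionary-based readability calculators such as the ones typically used in published analyses.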
Citation
Mastrokostas PG, Mastrokostas LE, Emara AK, Wellington IJ, Ginalis E, Houten JK, Khalsa AS, Saleh A, Razi AE, Ng MK. GPT-4 as a Source of Patient Information for Anterior Cervical Discectomy and Fusion: A Comparative Analysis Against Google Web Search. Global Spine J. 2024 Mar 21:21925682241241241. doi: 10.1177/21925682241241241. Epub ahead of print. PMID: 38513636.
DOI
10.1177/21925682241241241
Except where otherwise noted, this item is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License (https://creativecommons.org/licenses/by-nc-nd/4.0/).
Related articles
- ChatGPT as a Source of Patient Information for Lumbar Spinal Fusion and Laminectomy: A Comparative Analysis Against Google Web Search.
- Authors: Nian PP, Saleet J, Magruder M, Wellington IJ, Choueka J, Houten JK, Saleh A, Razi AE, Ng MK
- Issue date: 2024 Feb 20
- Ankle conFUSION: The quality and readability of information on the internet relating to ankle arthrodesis.
- Authors: Irwin SC, Lennon DT, Stanley CP, Sheridan GA, Walsh JC
- Issue date: 2021 Dec
- The Readability and Quality of Web-Based Patient Information on Nasopharyngeal Carcinoma: Quantitative Content Analysis.
- Authors: Tan DJY, Ko TK, Fan KS
- Issue date: 2023 Nov 27
- Readability Levels of Dental Patient Education Brochures.
- Authors: Boles CD, Liu Y, November-Rider D
- Issue date: 2016 Feb
- Can Artificial Intelligence Improve the Readability of Patient Education Materials?
- Authors: Kirchner GJ, Kim RY, Weddle JB, Bible JE
- Issue date: 2023 Nov 1