Abstract
OBJECTIVE: ChatGPT is a popular artificial intelligence (AI) tool used to answer questions on virtually any subject. Given its popularity, it is prudent to investigate its ability to answer common patient questions in the field of hand therapy, to better guide patients as they navigate the resources available to them.

METHODS: This is a cross-sectional, rater-based comparison study. Four common hand therapy questions were entered into ChatGPT version 3.5. For the same four questions, the first five results returned by a Google search were downloaded. Three certified hand therapists, blinded to the source, graded the ChatGPT and Google answers on Likert scales for accuracy (0-6), comprehensiveness (0-3), and conciseness (0-3).

RESULTS: ChatGPT was significantly more accurate, with an estimated marginal mean (EMM) of 5.75 (95% CI: 4.96, 6.54) versus 3.48 (95% CI: 2.86, 4.10) for Google (p < 0.001). ChatGPT was significantly more comprehensive, with an EMM of 2.50 (95% CI: 2.10, 2.90) versus 1.48 (95% CI: 1.19, 1.77) for Google (p < 0.001). ChatGPT was also significantly more concise, with an EMM of 3.00 (95% CI: 2.66, 3.34) versus 1.60 (95% CI: 1.29, 1.91) for Google (p < 0.001).

CONCLUSION: ChatGPT is a concise, comprehensive, and accurate alternative to a Google search for people seeking information on hand therapy. However, the free version of ChatGPT does not update its sources past 2019, and the software is known to occasionally present false information. Frequently updated academic websites should therefore remain the primary online medical resource for patients.