Abstract
BACKGROUND: ChatGPT, developed by OpenAI, is a large language model that offers a potential method for patient education. Whether patients with knee osteoarthritis (KOA) can benefit from patient education via ChatGPT has not been sufficiently investigated.

METHODS: We enrolled 60 participants between 1 January 2024 and 1 September 2024, each with a first clinical diagnosis of KOA. Participants were excluded from the analyses if they had post-traumatic osteoarthritis or a history of knee surgery. Participants received physician education (n = 18), free education with ChatGPT (n = 21), or supervised education with ChatGPT following a pre-defined outline of five reference questions (n = 21). The primary outcome was the physician-rated patient knowledge level of KOA, measured on a visual analogue scale (VAS, 0-100 mm). All answers from ChatGPT were also rated on the VAS.

RESULTS: Patients receiving free education with ChatGPT asked substantially more questions than those given the structured question outline (17.0 ± 9.3 versus 10.3 ± 7.6, P < 0.001). With the outline provided, ChatGPT gave higher-quality answers in the supervised education group than in the free education group (92.1 ± 4.3 versus 81.4 ± 10.4, P = 0.001). Finally, the supervised education with ChatGPT group achieved an educational effect (knowledge level, 95.3 ± 4.7) similar to that of the physician education group (95.6 ± 5.3), whereas the free education with ChatGPT group had a substantially lower knowledge level (82.1 ± 12.3, P < 0.001).

CONCLUSION: Supervised education with ChatGPT using structured questions achieved patient education outcomes comparable to physician education in individuals with KOA. In contrast, free education with ChatGPT resulted in lower knowledge levels and reduced answer quality, highlighting the need for caution in unsupervised use of artificial intelligence (AI).
This study provides preliminary real-world evidence supporting the responsible use of AI tools such as ChatGPT in patient education, particularly when guided by a pre-defined question outline.