Abstract
Early detection of arthritis in autoimmune rheumatic diseases (ARDs) is critical to prevent irreversible damage. Joint ultrasound (US) offers high sensitivity and availability in routine clinical practice. However, US is limited by examiner dependency and resource requirements. Infrared thermography (IRT) is a non-invasive, radiation-free method for examining surface temperature alterations linked to arthritis. Although promising, its diagnostic performance relative to joint US remains incompletely defined. The aim of this review was to examine the literature on IRT for arthritis detection and its relationship to joint US. We conducted a systematic review of PubMed, Web of Science, the Directory of Open Access Journals (DOAJ) and the Cochrane Central Register of Controlled Trials (CENTRAL) for studies published between January 2000 and December 2025. Studies were included if they assessed arthritic conditions using both IRT and US. Data on patient cohorts, assessment methods, findings, and diagnostic accuracy were extracted. Of 945 records, 19 studies met the inclusion criteria, primarily in rheumatoid arthritis (n = 13) and mixed populations (n = 4). IRT consistently differentiated inflamed from healthy joints. Sensitivity for detecting arthritis ranged from 79% to 100%, and specificity from 51% to 94%. Methods varied from basic temperature measurements to advanced approaches, including algorithmic segmentation and machine learning-based scores such as ThermoJIS and ThermoDAI. Despite methodological differences, IRT demonstrated reproducible results and was particularly effective in detecting subclinical synovitis. IRT shows strong potential as a complementary, examiner-independent tool for detecting and monitoring joint inflammation in ARDs. Its consistent correlation with US suggests that IRT could serve as a useful adjunct in rheumatologic diagnostics. Future studies with standardized protocols are needed to establish its clinical utility.