Abstract
BACKGROUND: Statistical software plays a central role in quantitative dental research, yet reporting practices remain inconsistent and often incomplete. Missing information on software names, versions, and analytical components limits methodological transparency and reduces reproducibility. This study systematically evaluated the current status of statistical software reporting in dental research publications.

METHODS: A methodological review was conducted following relevant elements of the PRISMA 2020 guidelines. Articles published in 2024 in the five highest-ranked and five lowest-ranked dentistry journals listed in the SCImago Journal Rank (SJR) database were screened. Eligible studies comprised empirical quantitative research and meta-analyses. Data extraction captured the software name, version, publisher, R packages, and the consistency of software reporting between the methods and results sections. Descriptive statistics and chi-square tests were used to summarise reporting patterns and to assess differences across journal ranking groups.

RESULTS: Of the 1,014 articles screened, 808 met the inclusion criteria. Statistical software was reported in 84.5% of studies, while 15.5% presented quantitative analyses without identifying any software. Among studies that reported software, 77.8% provided only the software name, and 6.6% reported complete details, including version and publisher. SPSS was the most frequently reported software (45.2%). Reporting completeness did not differ significantly between high- and low-ranked journals (p = 0.13).

CONCLUSION: Statistical software reporting in dental research is often incomplete, limiting transparency and reproducibility. Adoption of clearer and more standardised reporting guidelines may help improve methodological clarity across the field.