Abstract
We present a novel computer vision approach using DeepLabCut to objectively quantify orofacial dyskinesias in Parkinson's disease. Unlike traditional wearable sensors, which focus on limb movements, this method tracks tongue, chin, nose, and forehead landmarks in standard video recordings through a fully markerless deep learning pipeline, capturing metrics including displacement, variability, and peak movements. Analysis of a hospitalised patient over 4 days demonstrated a progressive reduction in dyskinetic parameters that correlated with medication adjustments and was consistent with concurrent clinical assessment using the Unified Dyskinesia Rating Scale (UDysRS) and the modified Abnormal Involuntary Movement Scale (mAIMS). Though resource-intensive, this tongue-tracking technique, which we term glossography, offers potential for remote monitoring in underserved areas with limited specialist access. It provides granular movement assessment using widely available technology, potentially enhancing treatment precision beyond traditional clinician-administered rating scales.
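To make the movement metrics named above concrete, the sketch below shows one plausible way to summarise a single tracked landmark trajectory (e.g. the tongue tip) into displacement, variability, and peak-movement values. The function name, the exact metric definitions, and the synthetic trajectory are illustrative assumptions, not the paper's actual pipeline; pose-estimation tools such as DeepLabCut export per-frame x/y coordinates per landmark, which is the only input assumed here.

```python
import numpy as np

def movement_metrics(xy, fps=30.0):
    """Summarise one landmark's trajectory (illustrative, not the
    authors' exact definitions).

    xy  -- (n_frames, 2) array of per-frame pixel coordinates for a
           single landmark, as exported by a pose-estimation tool.
    fps -- video frame rate, used to convert peak step size to speed.
    """
    xy = np.asarray(xy, dtype=float)
    # Frame-to-frame Euclidean step length, in pixels per frame.
    step = np.linalg.norm(np.diff(xy, axis=0), axis=1)
    return {
        "total_displacement_px": float(step.sum()),   # path length
        "variability_px": float(step.std()),          # SD of step size
        "peak_speed_px_per_s": float(step.max() * fps),
    }

# Example: a landmark moving horizontally by 2 px every frame.
traj = np.column_stack([np.arange(0.0, 20.0, 2.0), np.zeros(10)])
metrics = movement_metrics(traj, fps=30.0)
# 9 steps of 2 px -> total 18 px, zero variability, peak 60 px/s
```

In practice these per-landmark summaries would be computed per recording session and compared across days, which is how a progressive reduction in dyskinetic movement could be quantified.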