Abstract
Eye-tracking studies in virtual reality (VR) deliver insights into behavioral function. The gold standard for evaluating gaze behavior is manual scoring, which is labor-intensive. Previously proposed automated eye-tracking algorithms for VR head-mounted displays (HMDs) were neither validated against manual scoring nor tested on dynamic areas of interest (AOIs). Our study validates, against subjective human annotation, the accuracy of an automated scoring algorithm that determines temporal fixation behavior on static and dynamic AOIs in VR. The intraclass correlation coefficient (ICC) was calculated for the time of first fixation (TOFF) and the total fixation duration (TFD) in ten participants, each presented with 36 static and dynamic AOIs. High ICC values (≥0.982; p < 0.0001) were obtained when comparing the algorithm-generated TOFF and TFD to the raters' annotations. In sum, our algorithm accurately determines temporal parameters of gaze behavior in HMD-based VR. A reliable automated scoring system thus renders obsolete the considerable time that manual scoring demands of numerous raters. The algorithm proposed here was designed to subserve a separate study that uses TOFF and TFD to differentiate apathy from depression in individuals with Alzheimer's dementia.
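To make the two temporal parameters concrete, the sketch below shows one simple way TOFF and TFD could be derived from a per-frame record of whether gaze fell on an AOI. This is an illustrative assumption, not the authors' implementation: the function name, the boolean per-frame representation, and the fixed frame duration are all hypothetical.

```python
# Illustrative sketch (NOT the paper's algorithm): deriving time of first
# fixation (TOFF) and total fixation duration (TFD) from a per-frame
# boolean record of whether gaze fell on an area of interest (AOI).
def toff_and_tfd(on_aoi, frame_duration_s):
    """on_aoi: sequence of booleans, one per eye-tracker frame.

    Returns (TOFF, TFD) in seconds; TOFF is None if the AOI was
    never fixated during the trial.
    """
    toff = None
    tfd = 0.0
    for i, hit in enumerate(on_aoi):
        if hit:
            if toff is None:
                toff = i * frame_duration_s  # first frame gaze lands on the AOI
            tfd += frame_duration_s          # accumulate total dwell time
    return toff, tfd

# Example: a 90 Hz HMD; gaze lands on the AOI at frame 3 and stays 4 frames.
samples = [False, False, False, True, True, True, True, False]
toff, tfd = toff_and_tfd(samples, 1 / 90)
```

For dynamic AOIs, `on_aoi` would be computed per frame against the AOI's position at that frame; the temporal bookkeeping above is unchanged.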