Enhancing Autism Detection Through Gaze Analysis Using Eye Tracking Sensors and Data Attribution with Distillation in Deep Neural Networks


Abstract

Autism Spectrum Disorder (ASD) is a neurodevelopmental condition characterized by differences in social communication and repetitive behaviors, and is often associated with atypical visual attention patterns. This paper proposes the Gaze-Based Autism Classifier (GBAC), a deep neural network model that leverages both data distillation and data attribution techniques to enhance ASD classification accuracy and explainability. Using data sampled by eye-tracking sensors, the model identifies gaze behaviors linked to ASD and applies TracIn, an explainability technique for data attribution, to compute self-influence scores and filter out noisy or anomalous training samples. This refinement significantly improves both accuracy and computational efficiency: GBAC achieves a test accuracy of 94.35% while using only 77% of the dataset, outperforming the same model trained on the full dataset, models trained on random sample reductions, and benchmark methods. Additionally, the data attribution analysis identifies the most influential training examples, offering a deeper understanding of how gaze patterns correlate with ASD-specific characteristics. These results underscore the potential of integrating explainable artificial intelligence into neurodevelopmental disorder diagnostics, advancing clinical research by providing deeper insight into the visual attention patterns associated with ASD.
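The self-influence filtering described above can be illustrated with a minimal sketch. TracIn estimates an example's self-influence as the sum, over training checkpoints, of the learning rate times the squared norm of that example's loss gradient; high self-influence flags likely mislabeled or anomalous samples. The sketch below uses a toy logistic-regression stand-in rather than the paper's deep network, and all data, dimensions, and hyperparameters are hypothetical, chosen only to mirror the reported setting of keeping roughly 77% of the data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for gaze-derived features (hypothetical, not the paper's data).
n, d = 200, 5
X = rng.normal(size=(n, d))
true_w = rng.normal(size=d)
y = (X @ true_w > 0).astype(float)
# Flip a few labels to mimic noisy/anomalous recordings.
noisy_idx = rng.choice(n, size=10, replace=False)
y[noisy_idx] = 1.0 - y[noisy_idx]

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def grad(w, x, t):
    # Per-example gradient of the logistic cross-entropy loss.
    return (sigmoid(x @ w) - t) * x

# Train with SGD, saving weight checkpoints (TracIn sums over checkpoints).
eta, epochs = 0.1, 5
w = np.zeros(d)
checkpoints = []
for _ in range(epochs):
    for i in rng.permutation(n):
        w -= eta * grad(w, X[i], y[i])
    checkpoints.append(w.copy())

# Self-influence of example z: sum_t eta * ||grad L(w_t, z)||^2.
self_influence = np.zeros(n)
for w_t in checkpoints:
    g = (sigmoid(X @ w_t) - y)[:, None] * X  # all per-example gradients at w_t
    self_influence += eta * np.sum(g * g, axis=1)

# Keep the ~77% of samples with the lowest self-influence; the rest are
# treated as likely noisy and dropped before retraining.
keep = int(0.77 * n)
kept_idx = np.argsort(self_influence)[:keep]
flagged = set(np.argsort(self_influence)[keep:])
print("flagged injected-noise samples:", len(flagged & set(noisy_idx)))
```

In practice the same loop runs over saved checkpoints of the deep model, with gradients taken only with respect to the final-layer parameters to keep the computation tractable; the retained subset is then used to retrain the classifier.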
