Unsupervised Dynamic Time Warping Clustering for Robust Functional Network Identification in fNIRS Motor Tasks


Abstract

Functional near-infrared spectroscopy (fNIRS) is a valuable non-invasive modality for brain-computer interfaces (BCIs), but robust signal interpretation is challenged by the substantial temporal variability of the hemodynamic response. Standard linear methods, such as Pearson correlation, often fail to capture functional connectivity when signals exhibit temporal jitter. This study validates an unsupervised Dynamic Time Warping (DTW) clustering framework that robustly identifies motor networks from fNIRS data by accommodating non-linear temporal shifts. We analyzed a public fNIRS dataset (N = 30) across right-hand tapping (RHT), left-hand tapping (LHT), and foot tapping (FT) tasks. A robust preprocessing pipeline was implemented, including Wavelet Motion Correction and Common Average Referencing (CAR), to remove motion artifacts and global systemic noise. The core method computed Z-score normalized DTW distance matrices between channels, followed by hierarchical clustering. To validate the framework, we benchmarked it against a standard Pearson correlation method. The unsupervised DTW framework achieved a network identification accuracy of 53.17%, outperforming the Pearson correlation benchmark (48.06%); the difference was statistically significant (p < 0.05). The framework detected distinct, somatotopically correct modulations: superior-medial activation during foot tapping and lateralized activation during hand tapping. These findings demonstrate that unsupervised DTW clustering is a robust, data-driven approach that outperforms conventional linear methods in capturing functional networks during motor tasks, showing strong potential for next-generation asynchronous BCIs.
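
To illustrate the core pipeline described in the abstract, the sketch below shows how Z-score normalized channel time series can be compared with DTW and grouped by hierarchical clustering. This is a minimal sketch under stated assumptions, not the authors' implementation: the absolute-difference local cost, average linkage, the cluster count, and the toy signals are illustrative choices; only NumPy and SciPy are used.

```python
# Minimal sketch (not the authors' code): Z-score normalization, pairwise DTW
# distances between fNIRS channels, and average-linkage hierarchical clustering.
# Array shapes, the local cost, and the toy signals are illustrative assumptions.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform


def zscore(x):
    """Z-score normalize a single channel's time series."""
    return (x - x.mean()) / (x.std() + 1e-12)


def dtw_distance(a, b):
    """Classic O(len(a)*len(b)) dynamic-programming DTW with absolute-difference cost."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]


def cluster_channels(signals, n_clusters=2):
    """signals: (n_channels, n_samples) array of preprocessed time series.
    Returns cluster labels from hierarchical clustering on the DTW distance matrix."""
    normed = np.array([zscore(ch) for ch in signals])
    n = len(normed)
    dist = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            d = dtw_distance(normed[i], normed[j])
            dist[i, j] = dist[j, i] = d
    # Condense the symmetric matrix for scipy's linkage, then cut into n_clusters.
    Z = linkage(squareform(dist, checks=False), method="average")
    return fcluster(Z, t=n_clusters, criterion="maxclust")


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy example: three channels sharing a temporally jittered response, three noise-only.
    t = np.linspace(0, 20, 200)
    hrf = np.exp(-(t - 8) ** 2 / 4)  # crude hemodynamic-response bump
    group_a = [np.roll(hrf, s) + 0.05 * rng.standard_normal(t.size) for s in (0, 5, 10)]
    group_b = [0.05 * rng.standard_normal(t.size) for _ in range(3)]
    labels = cluster_channels(np.vstack(group_a + group_b), n_clusters=2)
    print(labels)  # the jittered-response channels should tend to share a label
```

Because DTW aligns the time axes before accumulating cost, channels with a shared but temporally shifted response receive small mutual distances, which is the property a Pearson correlation benchmark would miss.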
