AutoLDT: a lightweight spatio-temporal decoupling transformer framework with AutoML method for time series classification


Abstract

Time series classification finds widespread applications in civil, industrial, and military fields, and the classification performance of time series models has improved with recent advances in deep learning. However, issues of feature-extraction effectiveness, model complexity, and uncertainty in model design constrain the further development of time series classification. To address these issues, we propose a Lightweight Spatio-Temporal Decoupling Transformer framework based on Automated Machine Learning techniques (AutoLDT). The framework introduces a novel lightweight Transformer with fuzzy positional encoding, a TS-separable linear self-attention mechanism, and a convolutional feedforward network, which mine the temporal and spatial features of time series as well as their local and global relationships. Fuzzy positional encoding integrates fuzzy-set ideas to enhance the generalization of the model's information mining. The TS-separable linear self-attention mechanism and the convolutional feedforward network achieve feature extraction in a lightweight way by decoupling the temporal and spatial features of time series. Notably, we adopt the Covariance Matrix Adaptation Evolution Strategy (CMA-ES) and a global adaptive pruning technique to realize automated network-structure design, which further improves training efficiency and automation and avoids the uncertainty of manual network design. Finally, we validate the effectiveness of the proposed framework on the publicly available UCR and UEA time series datasets. The experimental results show that the proposed framework not only improves classification performance in a lightweight way but also dramatically improves training efficiency.
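The abstract's central architectural idea — decoupling temporal and spatial (channel-wise) attention, each computed with a linear-complexity kernel — can be illustrated with a minimal NumPy sketch. The fusion by averaging, the `elu(x)+1` feature map, and all weight shapes below are assumptions for illustration, not details taken from the paper:

```python
import numpy as np

def linear_attention(q, k, v):
    # Kernelized linear attention: cost O(n·d²) rather than the O(n²·d)
    # of softmax attention. Feature map elu(x)+1 keeps entries positive.
    phi = lambda x: np.where(x > 0, x + 1.0, np.exp(x))
    q, k = phi(q), phi(k)
    kv = k.T @ v                  # (d, d_v): aggregate keys/values first
    z = q @ k.sum(axis=0)         # (n,): per-query normalizer
    return (q @ kv) / z[:, None]

def ts_separable_attention(x, wq_t, wk_t, wv_t, wq_s, wk_s, wv_s):
    """Illustrative temporal/spatial decoupled attention.
    x: (T, C) multivariate series — T time steps, C channels.
    The temporal branch attends over time steps, the spatial branch over
    channels; averaging the two branches is an assumed fusion rule."""
    # Temporal branch: tokens are time steps, features are channels.
    t_out = linear_attention(x @ wq_t, x @ wk_t, x @ wv_t)
    # Spatial branch: transpose so tokens are channels, features are time steps.
    xt = x.T
    s_out = linear_attention(xt @ wq_s, xt @ wk_s, xt @ wv_s).T
    return 0.5 * (t_out + s_out)
```

Because each branch attends over only one axis, the cost grows linearly in both the sequence length and the channel count, which is the source of the "lightweight" claim.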
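The automated network-structure design relies on CMA-ES, which iteratively samples candidate real-valued encodings of the architecture, evaluates them, and moves the sampling distribution toward the best candidates. The sketch below uses a simplified (μ, λ) evolution strategy with a fixed step-size decay rather than full covariance adaptation; the objective, population sizes, and decay schedule are placeholders, not the paper's configuration:

```python
import numpy as np

def es_search(objective, dim, iters=50, pop=16, elite=4, sigma=0.5, seed=0):
    """Simplified (mu, lambda) evolution strategy, a stand-in for CMA-ES.
    objective maps a real vector (encoding e.g. layer count, head count,
    hidden width) to a scalar to minimise, such as validation loss."""
    rng = np.random.default_rng(seed)
    mean = np.zeros(dim)
    for _ in range(iters):
        # Sample a population around the current mean.
        cand = mean + sigma * rng.standard_normal((pop, dim))
        scores = np.array([objective(x) for x in cand])
        # Recombine the elite (lowest-score) candidates into the new mean.
        mean = cand[np.argsort(scores)[:elite]].mean(axis=0)
        sigma *= 0.95  # fixed decay; real CMA-ES adapts step size and covariance
    return mean
```

In an actual search, the continuous vector would be rounded or decoded into discrete architecture choices before each training-and-evaluation step.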
