A hybrid intrusion detection model based on dynamic spatial-temporal graph neural network in in-vehicle networks


Abstract

With the increasing complexity of the Internet of Vehicles (IoV) architecture and the continuous evolution of attack techniques, in-vehicle networks face unprecedented security challenges, while existing intrusion detection systems (IDSs) still exhibit multiple limitations in IoV scenarios. First, traditional IDSs often neglect potential spatial-temporal dependencies in network traffic, leading to insufficient modeling capability for sophisticated attack behaviors. Second, there remains a lack of hybrid IDSs capable of simultaneously addressing both intra-vehicle and external network attacks; detection capabilities are typically confined to a single environment or attack type. This paper proposes GCN-2-Former, an innovative spatial-temporal model that combines a Graph Convolutional Network (GCN) with a Transformer. The model employs a sliding-window mechanism and a dynamic graph construction strategy to map heterogeneous network traffic into spatial-temporal graph structures. Local spatial features are extracted via the GCN, while multi-layer Transformer modules model global temporal dependencies. Furthermore, a graph-level feature fusion strategy effectively integrates spatial and temporal characteristics. Experimental results indicate that the proposed model achieves an accuracy and F1-score of 99.98% on the CICIDS2017 dataset, which represents external network attacks, and a detection rate of 100% on the Car Hacking dataset, which represents intra-vehicle network attacks. It significantly outperforms existing mainstream methods, demonstrating excellent detection capability, robustness, and cross-domain generalization performance.
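The pipeline the abstract describes, sliding windows mapped to dynamic graphs, GCN layers for local spatial features, attention over window embeddings for temporal dependencies, and graph-level fusion, can be sketched as follows. This is a minimal NumPy illustration of the general technique, not the authors' implementation; all dimensions, the single-head attention, and the mean-pooling readout are assumptions chosen for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

def gcn_layer(A, X, W):
    """One GCN layer: ReLU(D^-1/2 (A+I) D^-1/2 X W)."""
    A_hat = A + np.eye(A.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
    A_norm = A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    return np.maximum(A_norm @ X @ W, 0.0)

def attention(H):
    """Single-head scaled dot-product self-attention over window embeddings."""
    scores = H @ H.T / np.sqrt(H.shape[1])
    w = np.exp(scores - scores.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)
    return w @ H

# Hypothetical sizes: 5 sliding windows, 8 traffic flows (nodes) per window,
# 16 raw features per flow, 4-dimensional hidden embedding.
n_windows, n_nodes, n_feat, hidden = 5, 8, 16, 4
W_gcn = rng.normal(size=(n_feat, hidden))

graph_embeddings = []
for _ in range(n_windows):
    X = rng.normal(size=(n_nodes, n_feat))            # flow features in this window
    A = (rng.random((n_nodes, n_nodes)) > 0.5).astype(float)
    A = np.maximum(A, A.T)                            # symmetric dynamic adjacency
    H = gcn_layer(A, X, W_gcn)                        # local spatial features
    graph_embeddings.append(H.mean(axis=0))           # graph-level pooling

T = np.stack(graph_embeddings)                        # (n_windows, hidden)
Z = attention(T)                                      # global temporal dependencies
fused = np.concatenate([T, Z], axis=1)                # spatial-temporal fusion
print(fused.shape)                                    # (5, 8)
```

In a full model the fused representation would feed a classification head, and the attention block would be replaced by stacked multi-head Transformer encoder layers; the sketch only shows how spatial (per-window graph) and temporal (cross-window) features are combined.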
