Abstract
To address the critical challenge of bidirectional dependency modeling in air traffic control (ATC) communications, a gap identified in state-of-the-art joint extraction models such as CII, this study proposes CII-BERT, an enhanced framework that integrates BERT's contextual representations with task-specific DNN layers. Motivated by the need to reduce aviation risks caused by instruction misinterpretation, we define "flight safety" through two quantifiable metrics: conflict probability reduction and instruction error rate. Our approach employs continuous pre-training with domain-adapted augmentation strategies (synonym substitution and random block exchange), enabling robust learning of ATC-specific syntax and terminology. Evaluated on a diverse control-directive dataset (over 6,000 samples across 7 operational scenarios), CII-BERT achieves 99.44% intent recognition accuracy and 99.23% information extraction accuracy, outperforming CII by 1.71 and 1.58 percentage points, respectively. Crucially, error analysis confirms a 37% reduction in high-risk edge cases (e.g., ambiguous altitude commands) relative to baseline models. These results demonstrate CII-BERT's potential to enhance smart airport operations not only in safety but also in controller workload optimization and runway efficiency.