Out of distribution detection with attention head masking for multimodal document classification


Abstract

Detecting out-of-distribution (OOD) data is critical for ensuring the reliability and safety of deployed machine learning systems by mitigating model overconfidence and misclassification. While existing OOD detection methods primarily focus on uni-modal inputs, such as images or text, their effectiveness in multi-modal settings, particularly documents, remains underexplored. Moreover, most approaches prioritize decision mechanisms over optimizing the underlying dense embedding representations for optimal separation. In this work, we introduce Attention Head Masking (AHM), a novel technique applied to Transformer-based models for both uni-modal and multi-modal OOD detection. Our empirical results demonstrate that AHM enhances embedding quality, significantly improving the separation between in-distribution and OOD data. Notably, our method reduces the false positive rate (FPR) by up to 10%, outperforming state-of-the-art approaches. Furthermore, AHM generalizes effectively to multi-modal document data, where textual and visual information are jointly modeled within a Transformer architecture. To encourage further research in this area, we introduce FinanceDocs, a high-quality, publicly available document AI dataset tailored for OOD detection. Our code and dataset are available at https://github.com/constantinouchristos/OOD-AHM.
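The abstract describes masking attention heads so that the resulting embeddings separate in-distribution (ID) from OOD data, but does not spell out the procedure. The minimal sketch below illustrates the general idea under stated assumptions: per-head output vectors are combined under a binary head mask, and an OOD score is taken as the distance of the masked embedding from an ID centroid. The head-selection policy and the distance-based score are illustrative choices, not the paper's actual method.

```python
import math

# Hypothetical sketch of Attention Head Masking (AHM) for OOD scoring.
# Everything below (uniform head averaging, centroid distance score) is
# an assumption for illustration; the paper's exact procedure may differ.

def masked_embedding(head_outputs, head_mask):
    """Average the per-head output vectors, dropping masked-out heads.

    head_outputs: list of equal-length vectors, one per attention head.
    head_mask: list of 0/1 flags; 0 removes that head's contribution.
    """
    dim = len(head_outputs[0])
    kept = [h for h, m in zip(head_outputs, head_mask) if m]
    if not kept:
        return [0.0] * dim
    return [sum(v[i] for v in kept) / len(kept) for i in range(dim)]

def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def ood_score(embedding, id_centroid):
    """Distance to the ID centroid: larger means more likely OOD."""
    return euclidean(embedding, id_centroid)

# Toy example: 4 heads with 3-dim outputs; heads 1 and 3 are masked.
heads = [[1.0, 0.0, 0.0],
         [9.0, 9.0, 9.0],
         [0.0, 1.0, 0.0],
         [9.0, 9.0, 9.0]]
mask = [1, 0, 1, 0]
emb = masked_embedding(heads, mask)      # -> [0.5, 0.5, 0.0]
score = ood_score(emb, [0.5, 0.5, 0.0])  # -> 0.0 for a perfectly ID sample
```

In a real Transformer the mask would be applied inside the attention layers (e.g. zeroing selected heads before the output projection) and the ID centroid estimated from training embeddings; the point here is only that removing noisy heads changes the embedding and hence the distance-based OOD decision.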
