Abstract
In medical image analysis, acquiring large-scale labeled datasets remains challenging, and medical images often exhibit high overall similarity whose interpretation demands expert-level knowledge, setting the task apart from natural image processing. To address these issues, we introduce the Information-Gated Memory (IGM) unit, a memory mechanism that enables deep networks to store and compare category-specific information. Unlike conventional CNNs or RNNs, the IGM unit performs memory-guided contrastive matching, allowing the network to focus on diagnostically relevant features and thereby improve classification performance. On a CBCT dataset of 392 individuals, split into subsets with and without artifacts, the proposed IGMNN achieved classification accuracies of [Formula: see text] and [Formula: see text], respectively.
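The abstract does not specify the internal design of the IGM unit. Purely as an illustrative assumption, not the authors' architecture, one way to realize "memory-guided contrastive matching" is to hold a learnable memory slot per category, score input features by cosine similarity against each slot, and use those similarities to gate the features before classification. All names and the gating form below are hypothetical:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class IGMSketch(nn.Module):
    """Hypothetical sketch of memory-guided contrastive matching.
    NOT the paper's IGM unit, whose design is not given in the abstract:
    features are gated by their similarity to learnable per-category
    memory slots, then classified against those same slots."""

    def __init__(self, feat_dim: int, num_classes: int):
        super().__init__()
        # One learnable memory vector per category (assumption).
        self.memory = nn.Parameter(torch.randn(num_classes, feat_dim))
        # Maps similarity scores to a per-feature gate (assumption).
        self.gate = nn.Linear(num_classes, feat_dim)

    def forward(self, feats: torch.Tensor) -> torch.Tensor:
        # Contrastive matching: cosine similarity to each memory slot.
        sims = F.cosine_similarity(
            feats.unsqueeze(1), self.memory.unsqueeze(0), dim=-1
        )  # shape (batch, num_classes)
        # Information gate: suppress features unrelated to any category.
        gated = feats * torch.sigmoid(self.gate(sims))
        # Logits from the gated features' alignment with each memory slot.
        return gated @ self.memory.t()  # shape (batch, num_classes)
```

Under this sketch, backpropagation shapes the memory slots into category prototypes, so the gate learns to pass diagnostically relevant features, one plausible reading of "store and compare category-specific information", though the actual IGM mechanism may differ entirely.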