SCATrans: semantic cross-attention transformer for drug-drug interaction prediction through multimodal biomedical data


Abstract

Predicting potential drug-drug interactions (DDIs) from biomedical data plays a critical role in drug therapy, drug development, drug regulation, and public health. However, it remains challenging due to the large number of possible drug combinations and to multimodal biomedical data that are disordered, imbalanced, prone to linguistic errors, and difficult to label. A Semantic Cross-Attention Transformer (SCAT) model is constructed to address these challenges. In the model, BioBERT, Doc2Vec, and a graph convolutional network are utilized to embed the multimodal biomedical data into vector representations; a BiGRU is adopted to capture contextual dependencies in both the forward and backward directions; cross-attention is employed to integrate the extracted features and explicitly model the dependencies between them; and a feature-joint classifier is adopted to implement DDI prediction (DDIP). Experimental results on the DDIExtraction-2013 dataset demonstrate that SCAT outperforms state-of-the-art DDIP approaches. SCAT expands the application of multimodal deep learning in the field of multimodal DDIP and can be applied in drug regulation systems to predict novel DDIs and DDI-related events.
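The core fusion step the abstract describes — cross-attention between features from different modalities (e.g. BiGRU-encoded text features attending over GCN-derived graph features) — can be sketched as follows. This is an illustrative single-head NumPy sketch under assumed shapes and projection matrices, not the authors' implementation; all variable names here are hypothetical.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(text_feats, graph_feats, Wq, Wk, Wv):
    """Single-head cross-attention: text tokens attend over graph nodes.

    Queries come from one modality (text), keys/values from the other
    (graph), so the output is a text-aligned fusion of graph features.
    """
    Q = text_feats @ Wq                        # (n_text, d)
    K = graph_feats @ Wk                       # (n_graph, d)
    V = graph_feats @ Wv                       # (n_graph, d)
    scores = Q @ K.T / np.sqrt(Q.shape[-1])    # (n_text, n_graph)
    weights = softmax(scores, axis=-1)         # rows sum to 1
    return weights @ V                         # (n_text, d)

# Toy example with hypothetical dimensions.
rng = np.random.default_rng(0)
d = 8
text_feats = rng.normal(size=(5, d))    # e.g. BiGRU outputs for 5 tokens
graph_feats = rng.normal(size=(3, d))   # e.g. GCN embeddings for 3 nodes
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))

fused = cross_attention(text_feats, graph_feats, Wq, Wk, Wv)
print(fused.shape)  # (5, 8)
```

In a full model, the fused representation would then be pooled and passed, together with the other modality streams, to the joint classifier; a symmetric graph-attends-to-text head could be concatenated for bidirectional fusion.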
