Abstract
Single-cell multi-omics data reveal complex cellular states and deepen our understanding of tissue cell phenotypes and functions. However, data analysis remains challenging due to the discrete nature and high noise level of the data, as well as missing modalities. Here, we propose scMultiNet, a multi-task deep adversarial neural network that integrates multiple tasks to analyze single-cell multi-modal data. In particular, we achieve joint training of multi-modal integration and cross-modal prediction tasks by introducing a cross-modal bi-prediction module and a multi-head self-attention module. Data denoising is further enhanced by an indicator matrix that constrains the reconstruction of the original expression values. Extensive simulations and real-data experiments demonstrate that scMultiNet outperforms existing state-of-the-art methods in dimensionality reduction, visualization, clustering, batch-effect removal, data denoising, multi-modal integration, and single-cell cross-modality translation, and in revealing cell type-specific biological insights. In addition, we demonstrate that scMultiNet can effectively transfer the complex relationships between modalities from one batch to another. In summary, scMultiNet is a comprehensive end-to-end framework well suited for analyzing single-cell multi-omics data.