International Journal of Computational Intelligence Systems (Jun 2024)

MocFormer: A Two-Stage Pre-training-Driven Transformer for Drug–Target Interactions Prediction

  • Yi-Lun Zhang,
  • Wen-Tao Wang,
  • Jia-Hui Guan,
  • Deepak Kumar Jain,
  • Tian-Yang Wang,
  • Swalpa Kumar Roy

DOI
https://doi.org/10.1007/s44196-024-00561-1
Journal volume & issue
Vol. 17, no. 1
pp. 1 – 11

Abstract


Predicting drug–target interactions is essential for advancing pharmaceuticals. Traditional drug–target interaction studies rely on labor-intensive laboratory techniques. However, recent advancements in computing power have elevated the importance of deep learning methods, which offer faster, more precise, and more cost-effective screening and prediction. Nonetheless, general deep learning methods often yield low-confidence results due to the complex nature of drugs and proteins, bias, limited labeled data, and feature extraction challenges. To address these challenges, a novel two-stage pre-trained framework is proposed for drug–target interaction prediction. In the first stage, pre-trained molecule and protein models develop a comprehensive feature representation, enhancing the framework’s ability to handle drug and protein diversity. This also reduces bias, improving prediction accuracy. In the second stage, a transformer with bilinear pooling and a fully connected layer produces predictions from the feature vectors. Comprehensive experiments were conducted on public datasets from DrugBank and the Epigenetic-regulators dataset to evaluate the framework’s effectiveness. The results demonstrate that the proposed framework outperforms state-of-the-art methods in terms of accuracy, area under the receiver operating characteristic curve, recall, and area under the precision-recall curve. The code is available at: https://github.com/DHCGroup/MocFormer .
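The second-stage prediction head the abstract describes — bilinear pooling over the drug and protein feature vectors, followed by a fully connected layer — can be sketched as below. This is a minimal NumPy illustration, not the paper's implementation: the function names, embedding dimensions, and random weights are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def bilinear_pool(drug_vec, prot_vec):
    """Fuse the two feature vectors via their outer product (bilinear
    pooling), then flatten into one interaction feature vector."""
    return np.outer(drug_vec, prot_vec).ravel()

def predict_interaction(drug_vec, prot_vec, W, b):
    """Fully connected layer + sigmoid -> interaction probability."""
    fused = bilinear_pool(drug_vec, prot_vec)
    logit = fused @ W + b
    return 1.0 / (1.0 + np.exp(-logit))

# Hypothetical dimensions; the paper's pre-trained embeddings are larger.
drug_vec = rng.normal(size=8)    # stand-in for the molecule embedding
prot_vec = rng.normal(size=8)    # stand-in for the protein embedding
W = rng.normal(size=64) * 0.1    # weights of the final fully connected layer
b = 0.0

prob = predict_interaction(drug_vec, prot_vec, W, b)
print(f"predicted interaction probability: {prob:.4f}")
```

In the full framework, the fused vector would come from transformer-encoded representations of the pre-trained molecule and protein embeddings rather than raw random vectors.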

Keywords