Nature Communications (Nov 2024)

Interformer: an interaction-aware model for protein-ligand docking and affinity prediction

  • Houtim Lai,
  • Longyue Wang,
  • Ruiyuan Qian,
  • Junhong Huang,
  • Peng Zhou,
  • Geyan Ye,
  • Fandi Wu,
  • Fang Wu,
  • Xiangxiang Zeng,
  • Wei Liu

DOI
https://doi.org/10.1038/s41467-024-54440-6
Journal volume & issue
Vol. 15, no. 1
pp. 1 – 12

Abstract

In recent years, the application of deep learning models to protein-ligand docking and affinity prediction, both vital for structure-based drug design, has garnered increasing interest. However, many of these models overlook the intricate modeling of interactions between ligand and protein atoms in the complex, consequently limiting their capacity for generalization and interpretability. In this work, we propose Interformer, a unified model built upon the Graph-Transformer architecture. The proposed model is designed to capture non-covalent interactions using an interaction-aware mixture density network. Additionally, we introduce a negative sampling strategy that enables an effective correction of the interaction distribution for affinity prediction. Experimental results on widely used public datasets and our in-house datasets demonstrate the effectiveness and universality of the proposed approach. Extensive analyses confirm our claim that the approach improves performance by accurately modeling specific protein-ligand interactions. Encouragingly, our approach achieves state-of-the-art (SOTA) performance on docking tasks.
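To give a flavor of the mixture-density-network idea mentioned in the abstract, the sketch below evaluates a 1-D Gaussian mixture over a protein-ligand atom-pair distance and the corresponding negative log-likelihood loss. This is a generic illustration of a mixture density head, not the paper's actual architecture: in Interformer the mixture parameters would be predicted per atom pair by the network, whereas here `pi`, `mu`, and `sigma` are fixed toy values chosen for the example.

```python
import numpy as np

def mdn_density(d, pi, mu, sigma):
    """Evaluate a 1-D Gaussian mixture density at pair distance d (angstroms).

    pi, mu, sigma are K-component mixture weights, means, and widths.
    In an interaction-aware model these would come from the network;
    here they are hand-picked toy values for illustration.
    """
    pi = np.asarray(pi, dtype=float)
    mu = np.asarray(mu, dtype=float)
    sigma = np.asarray(sigma, dtype=float)
    comp = np.exp(-0.5 * ((d - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))
    return float(np.sum(pi * comp))

def mdn_nll(d, pi, mu, sigma):
    """Negative log-likelihood of an observed distance d --
    the kind of per-pair loss used to train a mixture density head."""
    return -np.log(mdn_density(d, pi, mu, sigma) + 1e-12)

# Toy mixture peaked near a hydrogen-bond-like distance of ~3 A.
pi, mu, sigma = [0.7, 0.3], [3.0, 5.0], [0.3, 1.0]
print(mdn_density(3.0, pi, mu, sigma))  # high density near the favorable distance
print(mdn_density(8.0, pi, mu, sigma))  # low density far from any mode
```

A pose whose contact distances fall near the learned modes scores a low NLL, which is how such a head can rank docking poses by how physically plausible their interactions are.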