Complex & Intelligent Systems (Oct 2023)

Primary sequence based protein–protein interaction binder generation with transformers

  • Junzheng Wu
  • Eric Paquet
  • Herna L. Viktor
  • Wojtek Michalowski

DOI: https://doi.org/10.1007/s40747-023-01237-7
Journal volume & issue: Vol. 10, no. 2, pp. 2067–2082

Abstract

The design of binder proteins for specific target proteins using deep learning is a challenging task with a wide range of applications, from designing therapeutic antibodies to creating new drugs. Machine learning-based solutions, as opposed to laboratory design, streamline the design process and enable the creation of new proteins that may be required to address new and orphan diseases. Most techniques proposed in the literature require either domain knowledge or some appraisal of the target protein’s 3-D structure. This paper proposes an approach for designing binder proteins based solely on the amino acid sequence of the target protein, without recourse to domain knowledge or structural information. The sequences of the binders are generated with two new transformers, namely the AppendFormer and MergeFormer architectures. Because, in general, there is more than one binder for a given target protein, these transformers employ a binding score and a prior on the sequence of the binder to obtain a unique, targeted solution. Our experimental evaluation confirms the strengths of this novel approach. The performance of the models was assessed with 5-fold cross-validation and clearly indicates that our architectures lead to highly accurate results. In addition, scores of up to 0.98 were achieved in terms of the Needleman–Wunsch and Smith–Waterman similarity metrics, indicating that our solutions significantly outperform a seq2seq baseline model.
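
To make the similarity metrics concrete, the sketch below shows one way a generated binder sequence could be scored against a reference binder with Needleman–Wunsch (global) and Smith–Waterman (local) alignment, using Biopython's PairwiseAligner. This is an illustrative example only, not the authors' evaluation code: the identity scoring scheme, the normalization by the reference's self-alignment score, and the example sequences are assumptions made here for demonstration.

```python
# Illustrative sketch (not the paper's code): alignment-based similarity
# between a generated binder and a reference binder, normalized to [0, 1].
from Bio import Align

def similarity(generated: str, reference: str, mode: str = "global") -> float:
    """Alignment score of generated vs. reference, divided by the
    self-alignment score of the reference (identical sequences give 1.0)."""
    aligner = Align.PairwiseAligner()
    aligner.mode = mode            # "global" = Needleman-Wunsch, "local" = Smith-Waterman
    aligner.match_score = 1.0      # simple identity scoring, assumed for this sketch
    aligner.mismatch_score = 0.0
    aligner.open_gap_score = 0.0
    aligner.extend_gap_score = 0.0
    score = aligner.score(generated, reference)
    max_score = aligner.score(reference, reference)
    return score / max_score if max_score else 0.0

# Hypothetical sequences, for illustration only.
generated_binder = "MKTAYIAKQRQISFVKSHFSRQ"
reference_binder = "MKTAYIAKQRQISFVKSHFSRQLE"
print(f"NW similarity: {similarity(generated_binder, reference_binder, 'global'):.2f}")
print(f"SW similarity: {similarity(generated_binder, reference_binder, 'local'):.2f}")
```

Under this normalization, a score of 0.98 means the generated binder is nearly identical to the reference under the chosen alignment; how the paper normalizes its scores may differ.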

Keywords