Nature Communications (Oct 2024)

A long-context language model for deciphering and generating bacteriophage genomes

  • Bin Shao
  • Jiawei Yan

DOI
https://doi.org/10.1038/s41467-024-53759-4
Journal volume & issue
Vol. 15, no. 1
pp. 1–7

Abstract

Inspired by the success of large language models (LLMs), we develop a long-context generative model for genomes. Our multiscale transformer model, megaDNA, is pre-trained on unannotated bacteriophage genomes with nucleotide-level tokenization. We demonstrate the foundational capabilities of our model, including the prediction of essential genes, genetic variant effects, regulatory element activity, and the taxonomy of unannotated sequences. Furthermore, it generates de novo sequences of up to 96 kb, which contain potential regulatory elements and annotated proteins with phage-related functions.
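To make two of the abstract's ideas concrete, below is a minimal Python sketch of (1) nucleotide-level tokenization and (2) zero-shot variant-effect scoring by comparing autoregressive log-likelihoods, a common approach for genomic language models. This is not the authors' code: the four-token vocabulary, the scoring scheme, and the tiny stand-in model are all illustrative assumptions, and megaDNA's actual multiscale architecture is not reproduced here.

```python
# Illustrative sketch only; the model below is a toy stand-in,
# not megaDNA's multiscale transformer.
import torch
import torch.nn.functional as F

VOCAB = {"A": 0, "C": 1, "G": 2, "T": 3}  # assumed 4-token nucleotide vocabulary


def tokenize(seq: str) -> torch.Tensor:
    """Map a DNA string to a 1-D tensor of nucleotide token ids."""
    return torch.tensor([VOCAB[b] for b in seq.upper()], dtype=torch.long)


@torch.no_grad()
def log_likelihood(model, tokens: torch.Tensor) -> float:
    """Sum of per-token log-probabilities under a causal LM.

    `model` is any callable returning logits of shape (1, L, vocab_size).
    """
    logits = model(tokens.unsqueeze(0))           # (1, L, 4)
    logp = F.log_softmax(logits[:, :-1], dim=-1)  # predict token t+1 from its prefix
    target = tokens[1:].unsqueeze(0)              # (1, L-1)
    return logp.gather(-1, target.unsqueeze(-1)).sum().item()


def variant_effect(model, ref: str, alt: str) -> float:
    """Zero-shot effect score: log P(alt) - log P(ref).

    A more negative score suggests the variant is more disruptive
    under the model (an assumed, widely used convention).
    """
    return log_likelihood(model, tokenize(alt)) - log_likelihood(model, tokenize(ref))


class ToyCausalLM(torch.nn.Module):
    """Tiny per-position model so the sketch runs end to end."""

    def __init__(self, vocab_size: int = 4, dim: int = 32):
        super().__init__()
        self.emb = torch.nn.Embedding(vocab_size, dim)
        self.head = torch.nn.Linear(dim, vocab_size)

    def forward(self, x):
        return self.head(self.emb(x))


model = ToyCausalLM().eval()
print(variant_effect(model, ref="ATGACCGGT", alt="ATGACTGGT"))
```

The same log-likelihood machinery underlies the other zero-shot capabilities the abstract lists: scores over wild-type versus perturbed sequences can be aggregated per gene or per regulatory element, with no task-specific fine-tuning.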