
Understanding the Natural Language of DNA using Encoder-Decoder Foundation Models with Byte-level Precision

Authors:
Malusare, Aditya
Kothandaraman, Harish
Tamboli, Dipesh
Lanman, Nadia A.
Aggarwal, Vaneet
Publication Year:
2023

Abstract

This paper presents the Ensemble Nucleotide Byte-level Encoder-Decoder (ENBED) foundation model, which analyzes DNA sequences at byte-level precision with an encoder-decoder Transformer architecture. ENBED uses a sub-quadratic implementation of attention to develop an efficient model capable of sequence-to-sequence transformations, generalizing previous genomic models with encoder-only or decoder-only architectures. We use Masked Language Modeling to pre-train the foundation model on reference genome sequences and apply it to the following downstream tasks: (1) identification of enhancers, promoters, and splice sites; (2) recognition of sequences containing base-call mismatches and insertion/deletion errors, an advantage over tokenization schemes that group multiple base pairs into a single token and thereby lose byte-level precision; (3) identification of biological function annotations of genomic sequences; and (4) generation of mutations of the Influenza virus using the encoder-decoder architecture, validated against real-world observations. In each of these tasks, we demonstrate significant improvement over existing state-of-the-art results.

Comment: Accepted to OUP Bioinformatics Advances
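The advantage of byte-level tokenization claimed in task (2) can be illustrated with a small sketch (this is not the authors' code; the two tokenizer functions are hypothetical stand-ins): a single-base substitution changes exactly one token under byte-level tokenization, but perturbs several overlapping tokens under a k-mer scheme, blurring the location of the error.

```python
# Illustrative sketch: byte-level vs. k-mer tokenization of DNA.
# Both functions are hypothetical examples, not the ENBED implementation.

def byte_tokens(seq: str) -> list[str]:
    """One token per nucleotide (byte-level precision)."""
    return list(seq)

def kmer_tokens(seq: str, k: int = 3) -> list[str]:
    """Overlapping k-mer tokens with stride 1."""
    return [seq[i:i + k] for i in range(len(seq) - k + 1)]

ref = "ACGTACGT"
mut = "ACGAACGT"  # single substitution at position 3 (T -> A)

# Byte-level: exactly one token differs, pinpointing the mismatch.
byte_diff = sum(a != b for a, b in zip(byte_tokens(ref), byte_tokens(mut)))

# 3-mer: the same substitution alters three overlapping tokens.
kmer_diff = sum(a != b for a, b in zip(kmer_tokens(ref), kmer_tokens(mut)))

print(byte_diff, kmer_diff)  # 1 vs 3
```

The single mismatched byte-level token maps directly back to the erroneous base, which is what makes byte-level models suitable for detecting base-call mismatches and indels.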

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2311.02333
Document Type:
Working Paper
Full Text:
https://doi.org/10.1093/bioadv/vbae117