
Riboformer: a deep learning framework for predicting context-dependent translation dynamics

Authors:
Bin Shao
Jiawei Yan
Jing Zhang
Lili Liu
Ye Chen
Allen R. Buskirk
Source:
Nature Communications, Vol. 15, Iss. 1, pp. 1-10 (2024)
Publication Year:
2024
Publisher:
Nature Portfolio, 2024.

Abstract

Translation elongation is essential for maintaining cellular proteostasis, and alterations in the translational landscape are associated with a range of diseases. Ribosome profiling allows detailed measurements of translation at the genome scale. However, it remains unclear how to disentangle biological variations from technical artifacts in these data and identify sequence determinants of translation dysregulation. Here we present Riboformer, a deep learning-based framework for modeling context-dependent changes in translation dynamics. Riboformer leverages the transformer architecture to accurately predict ribosome densities at codon resolution. When trained on an unbiased dataset, Riboformer corrects experimental artifacts in previously unseen datasets, which reveals subtle differences in synonymous codon translation and uncovers a bottleneck in translation elongation. Further, we show that Riboformer can be combined with in silico mutagenesis to identify sequence motifs that contribute to ribosome stalling across various biological contexts, including aging and viral infection. Our tool offers a context-aware and interpretable approach for standardizing ribosome profiling datasets and elucidating the regulatory basis of translation kinetics.
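To make the description above concrete, the following is a minimal, hypothetical PyTorch sketch of the general idea the abstract describes: a transformer encoder over codon tokens that predicts a per-codon ribosome density, paired with a simple in silico mutagenesis scan. The class names, dimensions, single-input design, and mutagenesis scoring here are illustrative assumptions and do not reproduce the authors' Riboformer architecture or training setup.

```python
import torch
import torch.nn as nn


class CodonDensityModel(nn.Module):
    """Toy transformer that maps a codon sequence to per-codon ribosome density.
    Dimensions and depth are arbitrary choices for illustration."""

    def __init__(self, vocab_size=64, d_model=128, nhead=4, num_layers=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=nhead,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=num_layers)
        self.head = nn.Linear(d_model, 1)

    def forward(self, codons):
        # codons: (batch, seq_len) integer codon indices in [0, 63]
        x = self.embed(codons)           # (batch, seq_len, d_model)
        x = self.encoder(x)              # contextualized codon representations
        return self.head(x).squeeze(-1)  # (batch, seq_len) predicted density


def in_silico_mutagenesis(model, codons, position, n_codons=64):
    """Score how substituting each possible codon at `position` shifts the
    predicted density at that position (illustrative scoring only)."""
    model.eval()
    with torch.no_grad():
        baseline = model(codons)[0, position]
        effects = []
        for c in range(n_codons):
            mutant = codons.clone()
            mutant[0, position] = c
            effects.append((model(mutant)[0, position] - baseline).item())
    return effects


if __name__ == "__main__":
    model = CodonDensityModel()
    seq = torch.randint(0, 64, (1, 300))   # one toy ORF of 300 codons
    density = model(seq)                   # per-codon predicted ribosome density
    deltas = in_silico_mutagenesis(model, seq, position=150)
    print(density.shape, len(deltas))
```

In this sketch, the mutagenesis loop simply re-scores the sequence after each single-codon substitution; the published method should be consulted for how Riboformer actually defines inputs, outputs, and stalling scores.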

Subjects

Science

Details

Language:
English
ISSN:
2041-1723
Volume:
15
Issue:
1
Database:
Directory of Open Access Journals
Journal:
Nature Communications
Publication Type:
Academic Journal
Accession Number:
edsdoj.4a19ce39273348e8a8e77454bb363d00
Document Type:
Article
Full Text:
https://doi.org/10.1038/s41467-024-46241-8