
Assessing parameter efficient methods for pre-trained language model in annotating scRNA-seq data.

Authors :
Xia, Yucheng
Liu, Yuhang
Li, Tianhao
He, Sihan
Chang, Hong
Wang, Yaqing
Zhang, Yongqing
Ge, Wenyi
Source :
Methods. Aug 2024, Vol. 228, p. 12-21. 10 p.
Publication Year :
2024

Abstract

Annotating cell types in single-cell RNA sequencing (scRNA-seq) data is crucial for studying cellular heterogeneity in the tumor microenvironment. Recently, large-scale pre-trained language models (PLMs) have made significant progress in cell-type annotation of scRNA-seq data, effectively addressing earlier methods' shortcomings in performance and generalization. However, fully fine-tuning PLMs for each downstream task demands considerable computational resources, which is often impractical. A newer line of research therefore turns to parameter-efficient fine-tuning (PEFT), which optimizes a small number of parameters while leaving the majority frozen, substantially reducing computational cost. Here, we use scBERT, a large-scale pre-trained model, to explore the capabilities of three PEFT methods for scRNA-seq cell type annotation. Extensive benchmark studies across several datasets demonstrate the superior applicability of PEFT methods. Furthermore, downstream analysis with models obtained through PEFT shows their utility in novel cell type discovery and in model interpretability for potential marker genes. Our findings underscore the considerable potential of PEFT for PLM-based cell type annotation and offer new perspectives for the analysis of scRNA-seq data.

Highlights:
• Employ scBERT to investigate three parameter-efficient fine-tuning (PEFT) methods.
• Investigate the PEFT methods' ability to discover new cell types.
• Explore the interpretability of models fine-tuned using PEFT methods.
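To make the idea of parameter-efficient fine-tuning concrete, the sketch below shows a minimal LoRA-style adapter in PyTorch: the pre-trained weight is frozen and only a small low-rank update is trained. This is an illustrative assumption only; the abstract does not name the three PEFT methods applied to scBERT, and the class, layer sizes, and hyperparameters here (LoRALinear, rank, alpha) are hypothetical rather than the authors' implementation.

# Minimal LoRA-style PEFT sketch (illustrative; not the authors' exact setup).
# Idea: keep the pre-trained weight W frozen and train only a low-rank update B @ A.
import torch
import torch.nn as nn


class LoRALinear(nn.Module):
    """A frozen linear layer plus a small trainable low-rank update."""

    def __init__(self, in_features: int, out_features: int, rank: int = 8, alpha: float = 16.0):
        super().__init__()
        # Pre-trained weight: frozen, never updated during fine-tuning.
        self.base = nn.Linear(in_features, out_features)
        self.base.weight.requires_grad_(False)
        self.base.bias.requires_grad_(False)
        # Low-rank adapters A and B: the only trainable parameters in this layer.
        self.lora_A = nn.Parameter(torch.randn(rank, in_features) * 0.01)
        self.lora_B = nn.Parameter(torch.zeros(out_features, rank))
        self.scaling = alpha / rank

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Frozen base output plus the scaled low-rank correction.
        return self.base(x) + (x @ self.lora_A.T @ self.lora_B.T) * self.scaling


if __name__ == "__main__":
    layer = LoRALinear(in_features=200, out_features=200, rank=8)
    trainable = sum(p.numel() for p in layer.parameters() if p.requires_grad)
    total = sum(p.numel() for p in layer.parameters())
    print(f"trainable parameters: {trainable} / {total} ({100 * trainable / total:.1f}%)")

Running the example prints the fraction of trainable parameters, which illustrates the reduction in fine-tuned parameters that PEFT methods aim for.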

Details

Language :
English
ISSN :
1046-2023
Volume :
228
Database :
Academic Search Index
Journal :
Methods
Publication Type :
Academic Journal
Accession number :
177847780
Full Text :
https://doi.org/10.1016/j.ymeth.2024.05.007