
RetMIL: Retentive Multiple Instance Learning for Histopathological Whole Slide Image Classification

Authors :
Chu, Hongbo
Sun, Qiehe
Li, Jiawen
Chen, Yuxuan
Zhang, Lizhong
Guan, Tian
Han, Anjia
He, Yonghong
Publication Year :
2024

Abstract

Histopathological whole slide image (WSI) analysis with deep learning has become a research focus in computational pathology. The current paradigm is mainly based on multiple instance learning (MIL), among which approaches with Transformer backbones are widely discussed. These methods convert WSI tasks into sequence tasks by representing patches as tokens in the WSI sequence. However, the feature complexity brought by high heterogeneity and the ultra-long sequences brought by gigapixel size make Transformer-based MIL suffer from high memory consumption, slow inference speed, and limited performance. To this end, we propose a retentive MIL method called RetMIL, which processes WSI sequences through a hierarchical feature propagation structure. At the local level, the WSI sequence is divided into multiple subsequences; the tokens of each subsequence are updated through a parallel linear retention mechanism and aggregated using an attention layer. At the global level, the subsequences are fused into a global sequence, updated through a serial retention mechanism, and finally pooled into a slide-level representation by global attention pooling. We conduct experiments on the two public CAMELYON and BRACS datasets and an internal LUNG dataset, confirming that RetMIL not only achieves state-of-the-art performance but also significantly reduces computational overhead. Our code will be released shortly.

Comment: under review
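The two retention forms mentioned in the abstract can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: the decay factor `gamma`, the toy sequence sizes, and the single-vector attention-pooling parameterization are illustrative assumptions. The parallel form applies a causal decay mask in one matrix product, while the serial form accumulates a recurrent state; both compute the same output, which the final check confirms.

```python
import numpy as np

def parallel_retention(Q, K, V, gamma=0.9):
    """Parallel retention: O = (Q K^T * D) V, with decay mask
    D[n, m] = gamma**(n - m) for m <= n, else 0."""
    L = Q.shape[0]
    idx = np.arange(L)
    D = np.where(idx[:, None] >= idx[None, :],
                 gamma ** (idx[:, None] - idx[None, :]), 0.0)
    return (Q @ K.T * D) @ V

def serial_retention(Q, K, V, gamma=0.9):
    """Recurrent retention: S_n = gamma * S_{n-1} + K_n^T V_n, O_n = Q_n S_n."""
    S = np.zeros((K.shape[1], V.shape[1]))
    out = np.empty((Q.shape[0], V.shape[1]))
    for t in range(Q.shape[0]):
        S = gamma * S + np.outer(K[t], V[t])
        out[t] = Q[t] @ S
    return out

def attention_pool(H, w):
    """Attention pooling: alpha = softmax(H w), pooled = alpha^T H."""
    scores = H @ w
    alpha = np.exp(scores - scores.max())
    alpha /= alpha.sum()
    return alpha @ H

# Toy subsequence: 8 patch tokens with 4-dim features (sizes illustrative).
rng = np.random.default_rng(0)
Q, K, V = (rng.standard_normal((8, 4)) for _ in range(3))

# The parallel and serial forms are mathematically equivalent.
assert np.allclose(parallel_retention(Q, K, V), serial_retention(Q, K, V))

# Attention pooling collapses the updated tokens to one representation,
# analogous to the subsequence aggregation / slide-level pooling steps.
H = serial_retention(Q, K, V)
w = rng.standard_normal(4)
pooled = attention_pool(H, w)  # shape (4,): sequence-level embedding
```

The parallel form is efficient for updating many local subsequences at once, while the recurrent form processes the fused global sequence with constant memory in sequence length, which matches the local/global split the abstract describes.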

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2403.10858
Document Type :
Working Paper