Lazy Resampling: Fast and information preserving preprocessing for deep learning.
- Source :
- Computer methods and programs in biomedicine [Comput Methods Programs Biomed] 2024 Sep 19; Vol. 257, pp. 108422. Date of Electronic Publication: 2024 Sep 19.
- Publication Year :
- 2024
- Publisher :
- Ahead of Print
Abstract
Background and Objective: Preprocessing is a vital step in almost all deep learning workflows. In computer vision, manipulation of the intensity and spatial properties of data can improve network stability and provides an important source of generalisation for deep neural networks. Models are frequently trained with preprocessing pipelines composed of many stages, but these pipelines come with a drawback: each stage that resamples the data costs time, degrades image quality, and adds bias to the output. Long pipelines can also be complex to design, especially in medical imaging, where cropping data early can cause significant artifacts.

Methods: We present Lazy Resampling, software that rephrases spatial preprocessing operations as a graphics pipeline. Rather than each transform individually modifying the data, the transforms generate transform descriptions that are composited into a single resample operation wherever possible. This reduces pipeline execution time and, most importantly, limits signal degradation. It also enables simpler pipeline design, as crops and other operations become non-destructive. Lazy Resampling is designed to provide the maximum benefit to users without requiring them to understand the underlying concepts or change the way they build pipelines.

Results: We evaluate Lazy Resampling by comparing traditional pipelines with the corresponding lazy pipelines on Medical Segmentation Decathlon datasets. We demonstrate lower information loss in lazy pipelines than in traditional pipelines. We demonstrate that Lazy Resampling avoids the catastrophic loss of semantic segmentation label accuracy that occurs in traditional pipelines when labels are passed through a pipeline and then back through its inverse. Finally, we demonstrate statistically significant improvements when training UNets for semantic segmentation.

Conclusion: Lazy Resampling reduces the information loss incurred by pipelines that traditionally involve multiple resampling steps, and it enables researchers to build simpler pipelines by making operations such as rotation and cropping effectively non-destructive. It also makes it possible to invert labels back through a pipeline without catastrophic loss of accuracy. A reference implementation for Lazy Resampling can be found at https://github.com/KCL-BMEIS/LazyResampling. Lazy Resampling is being implemented as a core feature in MONAI, an open-source Python-based deep learning library for medical imaging, with a roadmap for full integration.

Competing Interests: The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this article.

(Copyright © 2024 The Authors. Published by Elsevier B.V. All rights reserved.)
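The core idea described in the Methods section, accumulating transform descriptions and resampling only once, can be illustrated with a minimal NumPy sketch. This is not MONAI's actual API; the helper names (`rotation`, `translation`, `lazy_resample`) are hypothetical, and nearest-neighbour sampling stands in for a real interpolator:

```python
import numpy as np

# Hypothetical sketch of lazy resampling: each spatial "transform" only
# returns a 3x3 homogeneous matrix describing its effect. The matrices
# are composited, and the image data are sampled exactly once, rather
# than being interpolated (and degraded) at every pipeline stage.

def rotation(theta):
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def translation(dy, dx):
    return np.array([[1.0, 0.0, dy], [0.0, 1.0, dx], [0.0, 0.0, 1.0]])

def lazy_resample(image, pending):
    """Composite all pending transforms, then sample the input once."""
    m = np.eye(3)
    for t in pending:          # accumulate matrices instead of resampling per stage
        m = t @ m
    inv = np.linalg.inv(m)     # map each output pixel back to input space
    h, w = image.shape
    ys, xs = np.mgrid[0:h, 0:w]
    coords = np.stack([ys.ravel(), xs.ravel(), np.ones(h * w)])
    src = inv @ coords
    sy = np.clip(np.rint(src[0]), 0, h - 1).astype(int)
    sx = np.clip(np.rint(src[1]), 0, w - 1).astype(int)
    return image[sy, sx].reshape(h, w)

img = np.arange(16.0).reshape(4, 4)
# Two translations composite into a single matrix; only one sampling pass occurs.
out = lazy_resample(img, [translation(1, 0), translation(0, 1)])
```

Because the two translations are fused into one matrix before any pixels are touched, the image is interpolated once rather than twice, which is the source of the reduced signal degradation reported above.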
Details
- Language :
- English
- ISSN :
- 1872-7565
- Volume :
- 257
- Database :
- MEDLINE
- Journal :
- Computer methods and programs in biomedicine
- Publication Type :
- Academic Journal
- Accession number :
- 39395305
- Full Text :
- https://doi.org/10.1016/j.cmpb.2024.108422