MGARD+: Optimizing Multilevel Methods for Error-Bounded Scientific Data Reduction.
- Source :
- IEEE Transactions on Computers; Jul2022, Vol. 71 Issue 7, p1522-1536, 15p
- Publication Year :
- 2022
-
Abstract
- Data reduction is becoming increasingly important in dealing with the large amounts of data produced by scientific applications. Existing multilevel compression algorithms offer a promising way to manage scientific data at scale, but may suffer from relatively low performance and reduction quality. In this paper, we propose MGARD+, a multilevel data reduction and refactoring framework drawing on previous multilevel methods, to achieve high-performance data decomposition and high-quality error-bounded lossy compression. Our contributions are four-fold: 1) We propose a level-wise coefficient quantization method, which uses different error tolerances to quantize the multilevel coefficients. 2) We propose an adaptive decomposition method that treats the multilevel decomposition as a preconditioner and terminates the decomposition process at an appropriate level. 3) We leverage a set of algorithmic optimization strategies to significantly improve the performance of multilevel decomposition/recomposition. 4) We evaluate the proposed method on four real-world scientific datasets and compare it with several state-of-the-art lossy compressors. Experiments demonstrate that our optimizations improve the decomposition/recomposition performance of the existing multilevel method by up to 70×, and that the proposed compression method improves the compression ratio by up to 2× over other state-of-the-art error-bounded lossy compressors at the same level of data distortion. [ABSTRACT FROM AUTHOR]
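The level-wise quantization idea in contribution 1) can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes the data has already been decomposed into per-level coefficient arrays, and uniformly quantizes each level with its own error tolerance (bin width of twice the tolerance), so the per-coefficient reconstruction error at each level stays within that level's tolerance.

```python
import numpy as np

def quantize_levels(level_coeffs, level_tolerances):
    """Uniformly quantize each level's coefficients with its own tolerance.

    level_coeffs: list of float arrays, one per decomposition level
    level_tolerances: list of per-level error tolerances (bin width = 2*tol)
    Returns a list of integer bin-index arrays.
    """
    quantized = []
    for coeffs, tol in zip(level_coeffs, level_tolerances):
        # Round to the nearest bin center; error per coefficient <= tol.
        quantized.append(np.round(coeffs / (2.0 * tol)).astype(np.int64))
    return quantized

def dequantize_levels(quantized, level_tolerances):
    """Reconstruct coefficients from bin indices (inverse of the above)."""
    return [q * (2.0 * tol) for q, tol in zip(quantized, level_tolerances)]
```

The integer bin indices are what a subsequent lossless encoder would compress; using a looser tolerance on levels that contribute less to the overall error budget is what allows a higher compression ratio under the same distortion bound.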
Details
- Language :
- English
- ISSN :
- 0018-9340
- Volume :
- 71
- Issue :
- 7
- Database :
- Complementary Index
- Journal :
- IEEE Transactions on Computers
- Publication Type :
- Academic Journal
- Accession number :
- 157325211
- Full Text :
- https://doi.org/10.1109/TC.2021.3092201