1. Probabilistic quotient normalization as robust method to account for dilution of complex biological mixtures. Application in ¹H NMR metabonomics
- Authors
Dieterle, Frank; Ross, Alfred; Schlotterbeck, Götz; Senn, Hans
- Subjects
Metabonomic analysis -- Spectra; Nuclear magnetic resonance -- Observations; Chemistry
- Abstract
For the analysis of the spectra of complex biofluids, preprocessing methods play a crucial role in rendering the subsequent data analyses more robust and accurate. Normalization is a preprocessing method that accounts for different dilutions of samples by scaling the spectra to the same virtual overall concentration. In the field of ¹H NMR metabonomics, integral normalization, which scales spectra to the same total integral, is the de facto standard. In this work, it is shown that integral normalization is a suboptimal method for normalizing spectra from metabonomic studies. In particular, strong metabonomic changes, evident as massive amounts of single metabolites in samples, significantly hamper integral normalization, resulting in incorrectly scaled spectra. Probabilistic quotient normalization is introduced in this work. This method is based on the calculation of a most probable dilution factor from the distribution of the quotients of the amplitudes of a test spectrum by those of a reference spectrum. Simulated spectra, spectra of urine samples from a metabonomic study with cyclosporin-A as the active compound, and spectra of more than 4000 samples from control animals demonstrate that probabilistic quotient normalization is far more robust and more accurate than the widespread integral normalization and vector length normalization.
- Published
- 2006
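The normalization the abstract describes can be sketched in a few lines of NumPy. This is an illustrative reconstruction from the abstract alone, not the authors' code: the choice of the median spectrum as reference, the initial integral normalization, and the median of the quotients as the "most probable" dilution factor are common conventions for this method and are assumptions here.

```python
import numpy as np

def pqn_normalize(spectra):
    """Probabilistic quotient normalization (sketch).

    `spectra` is a 2-D array, one spectrum per row.
    1. Integral-normalize each spectrum as a starting point.
    2. Take the median spectrum across samples as the reference
       (assumption; the paper also allows e.g. a control-group reference).
    3. For each spectrum, compute the element-wise quotients against the
       reference and divide by their median, the most probable dilution factor.
    """
    spectra = np.asarray(spectra, dtype=float)
    # Step 1: scale every spectrum to unit total integral
    normed = spectra / spectra.sum(axis=1, keepdims=True)
    # Step 2: reference spectrum = median across all samples
    reference = np.median(normed, axis=0)
    # Step 3: most probable dilution factor = median of the quotients
    quotients = normed / reference
    dilution = np.median(quotients, axis=1, keepdims=True)
    return normed / dilution

# Usage: three copies of the same spectrum at different dilutions
# collapse onto a single normalized profile.
base = np.array([1.0, 2.0, 3.0, 4.0])
samples = np.vstack([base, 2.0 * base, 3.0 * base])
out = pqn_normalize(samples)
```

Using the median of the quotients (rather than, say, the mean) is what gives the method its robustness: a massively elevated single metabolite shifts only a few quotients, leaving the median, and hence the estimated dilution factor, essentially unchanged.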