
SuperMix: Sparse Regularization for Mixtures

Authors:
de Castro, Yohann
Gadat, Sébastien
Marteau, Clément
Maugis, Cathy
Publication Year:
2019

Abstract

This paper investigates the statistical estimation of a discrete mixing measure $\mu_0$ involved in a kernel mixture model. Using recent advances in $\ell_1$-regularization over the space of measures, we introduce a "data fitting and regularization" convex program for estimating $\mu_0$ in a grid-less manner from a sample of the mixture law; this method is referred to as the Beurling-LASSO. Our contribution is twofold: we derive a lower bound on the bandwidth of our data-fitting term, depending only on the support of $\mu_0$ and its so-called "minimum separation", that ensures quantitative support localization error bounds; and, under a so-called "non-degenerate source condition", we derive a non-asymptotic support stability property. The latter shows that, for a sufficiently large sample size $n$, our estimator has exactly as many weighted Dirac masses as the target $\mu_0$, converging in amplitude and localization towards the true ones. Finally, we also introduce tractable algorithms for solving this convex program, based on the "Sliding Frank-Wolfe" or "Conic Particle Gradient Descent" methods. The statistical performance of this estimator is investigated by designing a so-called "dual certificate" appropriate to our setting. Some classical situations, such as mixtures of super-smooth distributions (e.g., Gaussian distributions) or ordinary-smooth distributions (e.g., Laplace distributions), are discussed at the end of the paper.
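As a rough illustration only (not the paper's exact formulation), a Beurling-LASSO program over measures typically takes the form
$$\hat{\mu} \in \operatorname*{arg\,min}_{\mu \in \mathcal{M}(\Theta)} \ \frac{1}{2}\,\big\| \Phi\mu - \hat{f}_n \big\|_{\mathcal{H}}^2 \;+\; \lambda\, |\mu|(\Theta),$$
where $\mathcal{M}(\Theta)$ is a set of measures on the parameter space $\Theta$, $\Phi\mu = \int_\Theta \varphi_\theta \,\mathrm{d}\mu(\theta)$ maps a mixing measure to the corresponding kernel mixture, $\hat{f}_n$ is an empirical proxy built from the $n$ observations, $|\mu|(\Theta)$ is the total variation norm playing the role of the $\ell_1$ penalty, and $\lambda > 0$ is the regularization parameter. The specific data-fitting norm $\|\cdot\|_{\mathcal{H}}$, the proxy $\hat{f}_n$, and the choice of $\lambda$ and bandwidth used in the paper may differ from this generic sketch.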

Details

Database:
arXiv
Publication Type:
Report
Accession Number:
edsarx.1907.10592
Document Type:
Working Paper