Consistent estimation of the missing mass for feature models
- Publication Year :
- 2019
Abstract
- Feature models are popular in machine learning and have recently been used to solve many unsupervised learning problems. In these models, every observation is endowed with a finite set of features, usually selected from an infinite collection $(F_{j})_{j\geq 1}$; each observation displays feature $F_{j}$ with an unknown probability $p_{j}$. A statistical problem inherent to these models is how to estimate, given an initial sample, the conditional expected number of hitherto unseen features that will be displayed in a future observation. This is usually referred to as the missing mass problem. In this work we prove that, under a suitable multiplicative loss function and without imposing any assumptions on the parameters $p_{j}$, no universally consistent estimator of the missing mass exists. In the second part of the paper we focus on a special class of heavy-tailed probabilities $(p_{j})_{j\geq 1}$, which are common in many real applications, and show that, within this restricted class, the nonparametric estimator of the missing mass suggested by Ayed et al. (2017) is strongly consistent. As a byproduct, we derive concentration inequalities for the missing mass and for the number of features observed with a specified frequency in a sample of size $n$.
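To make the quantity concrete: a minimal Python sketch of a Good–Turing-type estimate of the feature missing mass, computed as $K_{1}/n$, where $K_{1}$ is the number of features observed in exactly one of the $n$ observations. This is an illustrative assumption about the form of the nonparametric estimator the abstract refers to, not a reproduction of the paper's construction; the data layout (a list of feature sets) is likewise hypothetical.

```python
from collections import Counter

def missing_mass_estimate(observations):
    """Good-Turing-type estimate of the feature missing mass.

    `observations` is a list of sets; each set holds the features
    displayed by one observation. The estimate is K_1 / n, where
    K_1 counts the features seen in exactly one observation.
    """
    n = len(observations)
    counts = Counter(f for obs in observations for f in obs)
    k1 = sum(1 for c in counts.values() if c == 1)
    return k1 / n

# Example: 4 observations over string-labelled features.
obs = [{"a", "b"}, {"a", "c"}, {"b", "d"}, {"a"}]
print(missing_mass_estimate(obs))  # "c" and "d" appear once -> 2/4 = 0.5
```

The intuition matches the classical Good–Turing idea: features seen only once are the best proxy for features not yet seen at all, which is also why the abstract's concentration results involve the number of features observed with a specified frequency.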
- Subjects :
- Mathematics - Statistics Theory
Details
- Database :
- arXiv
- Publication Type :
- Report
- Accession number :
- edsarx.1902.10530
- Document Type :
- Working Paper