An Information-Theoretic Framework for Fast and Robust Unsupervised Learning via Neural Population Infomax
- Publication Year :
- 2016
- Publisher :
- arXiv, 2016.
Abstract
- A framework is presented for unsupervised learning of representations based on the infomax principle for large-scale neural populations. We use an asymptotic approximation to the Shannon mutual information of a large neural population to demonstrate that a good initial approximation to the global information-theoretic optimum can be obtained by a hierarchical infomax method. Starting from this initial solution, an efficient algorithm based on gradient descent of the final objective function is proposed to learn representations from the input datasets; the method works for complete, overcomplete, and undercomplete bases. As confirmed by numerical experiments, our method is robust and highly efficient at extracting salient features from input datasets. Compared with the main existing methods, our algorithm has a distinct advantage in both training speed and robustness of unsupervised representation learning. Furthermore, the proposed method is easily extended to supervised or unsupervised models for training deep network structures.
- Comment: 25 pages, 7 figures, 5th International Conference on Learning Representations (ICLR 2017)
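The record carries only the abstract, not the paper's equations, so the hierarchical-infomax algorithm itself cannot be reproduced here. As a rough illustration of the infomax idea the abstract describes (gradient-based optimization of a mutual-information objective for a complete basis), the following is a minimal Bell-Sejnowski-style natural-gradient sketch; the tanh nonlinearity, learning rate, and synthetic Laplace sources are all assumptions for illustration, not the authors' method:

```python
import numpy as np

# Illustrative infomax sketch (Bell-Sejnowski natural-gradient ICA).
# NOT the paper's hierarchical-infomax algorithm -- just a generic example
# of learning a representation by maximizing mutual information between
# inputs and a nonlinear encoding, for a complete (square) basis.

rng = np.random.default_rng(0)

# Synthetic 2-source mixture: x = A s, with super-Gaussian sources s.
n, T = 2, 5000
S = rng.laplace(size=(n, T))           # independent Laplace sources
A = rng.normal(size=(n, n))            # unknown mixing matrix
X = A @ S                              # observed signals

W = np.eye(n)                          # unmixing matrix to be learned
lr = 0.1
for _ in range(300):
    Y = W @ X                          # current source estimates
    # Natural-gradient infomax update: dW = (I - E[tanh(y) y^T]) W
    G = np.eye(n) - np.tanh(Y) @ Y.T / T
    W += lr * G @ W

# At the infomax fixed point, P = W @ A approaches a scaled permutation,
# i.e. the learned representation recovers the independent sources.
P = W @ A
```

The natural-gradient form `(I - E[tanh(y) y^T]) W` is used instead of the plain gradient because it avoids a matrix inversion per step and converges faster; this is a standard choice for infomax-type objectives, not something stated in the abstract.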
- Subjects :
- FOS: Computer and information sciences
FOS: Biological sciences
Machine Learning (cs.LG)
Artificial Intelligence (cs.AI)
Information Theory (cs.IT)
Machine Learning (stat.ML)
Neurons and Cognition (q-bio.NC)
Details
- Database :
- OpenAIRE
- Accession number :
- edsair.doi.dedup.....8ebfb4086b9587e3d5c4c9f09441413f
- Full Text :
- https://doi.org/10.48550/arxiv.1611.01886