1. Small Data Challenges in Big Data Era: A Survey of Recent Progress on Unsupervised and Semi-Supervised Methods
- Author
- Jiebo Luo and Guo-Jun Qi
- Subjects
Machine Learning, Computer Vision and Pattern Recognition, Artificial Intelligence, Big Data, Small Data, Labeled Data, Feature Learning, Supervised Machine Learning, Principles of Learning, Neural Networks, Data Science, Applied Mathematics, Computational Theory and Mathematics, Algorithms, Software
- Abstract
Representation learning with small labeled data has emerged in many problems, since the success of deep neural networks often relies on the availability of a huge amount of labeled data that is expensive to collect. To address this challenge, many efforts have been made to train sophisticated models with few labeled data in unsupervised and semi-supervised fashions. In this paper, we review the recent progress in these two major categories of methods. A wide spectrum of models is organized into a big picture, showing how they interplay with each other to motivate the exploration of new ideas. We review the principles of learning transformation-equivariant, disentangled, self-supervised, and semi-supervised representations, all of which underpin the foundation of recent progress. Many implementations of unsupervised and semi-supervised generative models have been developed on the basis of these criteria, greatly expanding the territory of existing autoencoders, generative adversarial networks (GANs), and other deep networks by exploiting the distribution of unlabeled data for more powerful representations. We discuss emerging topics by revealing the intrinsic connections between unsupervised and semi-supervised learning, and propose future directions to bridge the algorithmic and theoretical gap between transformation equivariance for unsupervised learning and supervised invariance for supervised learning, and to unify unsupervised pretraining and supervised finetuning. We also provide a broader outlook on future directions that unify transformation and instance equivariance for representation learning, connect unsupervised and semi-supervised augmentations, and explore the role of self-supervised regularization in many learning problems.
- Comment
- Published in IEEE Transactions on Pattern Analysis and Machine Intelligence
- Published
- 2022