1. Hierarchical BoW with segmental sparse coding for large scale image classification and retrieval
- Authors
Jianshe Zhou, Jie Liu, Sheng Tang, and Narentuya
- Subjects
Vocabulary, Computer Networks and Communications, Computer science, Inverted index, Regularization (mathematics), Media Technology, Visual Word, Image retrieval, Contextual image classification, Quantization (signal processing), Pattern recognition, Sparse approximation, Categorization, Hardware and Architecture, Artificial intelligence, Neural coding, Software
- Abstract
The bag-of-words (BoW) model has been widely regarded as one of the most successful approaches to content-based image tasks such as large-scale image retrieval, classification, and object categorization. Large visual vocabularies acquired through BoW quantization with large codebooks have received much attention in recent years. However, both the construction of a large vocabulary and the quantization process impose heavy time and memory costs. To tackle this issue, we propose an efficient hierarchical BoW (HBoW) that achieves a large set of visual words through quantization with a compact vocabulary instead of a large one. Our vocabulary is very compact since it consists of only two small dictionaries, which are learned through segmental sparse decomposition of local features. To generate a BoW of large size, we first divide each local feature into two halves and use the two small dictionaries to compute their sparse codes. We then map the indices of the maximum elements of the two sparse codes to a large set of visual words, based on the fact that data with similar properties share the same basis weighted with the largest sparse coefficient. To further give similar patches a higher probability of selecting the same dictionary basis, and hence of obtaining similar BoW vectors, we propose a novel collaborative dictionary learning method that imposes a similarity regularization factor together with row-sparsity regularization across data instances during group sparse coding. Additionally, based on combining the indices of the top-2 largest sparse coefficients of local descriptors, we propose a soft BoW assignment method so that HBoW tolerates different word selections for similar patches. By employing an inverted file structure built on HBoW, K-nearest neighbors (KNN) can be retrieved efficiently.
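The core index-mapping step of the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the solver (a toy ISTA loop), the function names (`sparse_code`, `encode_hbow`), and the dictionary sizes are all assumptions made for the example.

```python
import numpy as np

def sparse_code(x, D, lam=0.1, n_iter=50):
    """Toy ISTA solver for min ||x - D a||^2 + lam*||a||_1.
    (Assumption: the paper's exact sparse-coding solver is not given here.)"""
    a = np.zeros(D.shape[1])
    L = np.linalg.norm(D, 2) ** 2  # Lipschitz constant of the quadratic term
    for _ in range(n_iter):
        a = a - (D.T @ (D @ a - x)) / L            # gradient step
        a = np.sign(a) * np.maximum(np.abs(a) - lam / L, 0.0)  # soft threshold
    return a

def encode_hbow(feature, D1, D2):
    """Split a local descriptor into two halves, sparse-code each half
    against its small dictionary, and combine the two argmax indices
    into one index in a large virtual vocabulary of size K1 * K2."""
    half = len(feature) // 2
    a1 = sparse_code(feature[:half], D1)
    a2 = sparse_code(feature[half:], D2)
    i = int(np.argmax(np.abs(a1)))
    j = int(np.argmax(np.abs(a2)))
    return i * D2.shape[1] + j  # large visual-word id

# Two small 256-atom dictionaries yield a 256*256 = 65536-word vocabulary.
rng = np.random.default_rng(0)
D1 = rng.standard_normal((64, 256)); D1 /= np.linalg.norm(D1, axis=0)
D2 = rng.standard_normal((64, 256)); D2 /= np.linalg.norm(D2, axis=0)
word = encode_hbow(rng.standard_normal(128), D1, D2)
print(word)  # an id in [0, 65536)
```

The point of the construction is that two 256-atom dictionaries index a 65536-word vocabulary, so the quantization cost scales with the small dictionaries rather than with the large vocabulary.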
After incorporating our fast KNN search into the SVM-KNN classification method, HBoW can be used for efficient image classification and logo recognition. Experiments on several well-known datasets show that our approach is effective for large-scale image classification and retrieval.
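The inverted-file retrieval mentioned above can be sketched with a plain posting-list structure. This is an illustrative assumption, not the paper's data layout: candidates are scored by the number of shared visual words, which stands in for the paper's actual similarity measure.

```python
from collections import defaultdict, Counter

def build_inverted_index(image_words):
    """image_words: {image_id: list of visual-word ids}.
    Returns word -> list of images containing that word (posting lists)."""
    index = defaultdict(list)
    for img, words in image_words.items():
        for w in set(words):  # de-duplicate words within one image
            index[w].append(img)
    return index

def knn(query_words, index, k=3):
    """Vote for images sharing visual words with the query; return top-k ids.
    (Assumption: shared-word count as the ranking score, for illustration.)"""
    votes = Counter()
    for w in query_words:
        for img in index.get(w, ()):
            votes[img] += 1
    return [img for img, _ in votes.most_common(k)]

db = {"a": [1, 2, 3], "b": [2, 3, 4], "c": [9, 10]}
idx = build_inverted_index(db)
neighbors = knn([2, 3], idx, k=2)
print(neighbors)  # images "a" and "b" share the most words with the query
```

Because only the posting lists of the query's words are touched, retrieval cost depends on how many images share those words rather than on the whole database, which is what makes the KNN step inside SVM-KNN cheap.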
- Published
- 2018