
Combination of spatio-temporal and transform domain for sparse occlusion estimation by optical flow.

Authors :
Chen, Pengguang
Zhang, Xingming
Yuen, Pong C.
Mao, Aihua
Source :
Neurocomputing. Nov 2016, Vol. 214, p368-375. 8p.
Publication Year :
2016

Abstract

Lack of information in occluded regions leads to inherent ambiguity, which is a major challenge for motion estimation. Recently, the sparse model has been widely used, since the essential content of the motion field can be effectively preserved by a sparse representation. Methods exploiting sparsity acquire representations either directly in the spatio-temporal domain or indirectly in the transform domain. Usually, the sparse model with a sparsifying transform is patch-based and thus more robust against noise, while the sparse model without a sparsifying transform can work directly on the image as a whole. Aiming to tackle motion ambiguity efficiently, this paper incorporates a distinct sparse representation model into a variational framework for estimating occlusion jointly with optical flow. To deal with dictionary learning, which is computationally expensive and requires preprocessing to extend the sparsifying-transform model to arbitrary image sizes, we present a new unified framework that directly generates an overall dictionary via the sparse model without a sparsifying transform, and then optimizes small-size dictionaries over the corresponding patches using the overall dictionary. Our framework is based on the Stein–Weiss analysis function, which acts as a novel regulariser in the variational model and as a sparsifying transform function in the sparsity model. Experiments show that the proposed method outperforms existing methods that jointly estimate occlusion and optical flow. [ABSTRACT FROM AUTHOR]
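For orientation, the following is a minimal sketch of the kind of joint energy that occlusion-aware, sparsity-regularised optical-flow methods of this type minimise. The symbols, terms, and weights below are illustrative assumptions for a generic formulation, not the authors' exact model.

% Illustrative joint energy (assumed generic form, not the paper's formulation):
%   u      : optical-flow field
%   o      : occlusion indicator (0 = visible, 1 = occluded)
%   I_0, I_1 : consecutive frames
%   \Psi   : a sparsifying (analysis) transform applied to the flow
%   \rho   : a robust data penalty; \lambda, \mu, \nu : positive weights
E(u, o) =
    \int_\Omega \bigl(1 - o(x)\bigr)\,
        \rho\bigl(I_1(x + u(x)) - I_0(x)\bigr)\,dx   % data term, switched off in occluded regions
  + \lambda \int_\Omega \lvert \nabla u(x) \rvert \, dx  % spatial smoothness of the flow
  + \mu \,\lVert \Psi u \rVert_1                         % sparsity in the transform domain
  + \nu \int_\Omega o(x)\,dx                             % penalty against labelling everything occluded

In such formulations, the occlusion term prevents the data cost from being trivially removed everywhere, while the sparsity term plays the role the abstract assigns to the sparsifying transform; the paper's specific choice is the Stein–Weiss analysis function.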

Details

Language :
English
ISSN :
0925-2312
Volume :
214
Database :
Academic Search Index
Journal :
Neurocomputing
Publication Type :
Academic Journal
Accession number :
118813646
Full Text :
https://doi.org/10.1016/j.neucom.2016.06.025