
Efficient video segment matching for detecting temporal-based video copies

Authors :
Chiu, Chih-Yi
Tsai, Tsung-Han
Hsieh, Cheng-Yu
Source :
Neurocomputing. Apr 2013, Vol. 105, p70-80. 11p.
Publication Year :
2013

Abstract

Content-based video copy detection has attracted increasing attention in the video search community due to the rapid proliferation of video copies over the Internet. Most existing techniques of video copy detection focus on spatial-based video transformations, such as brightness enhancement and caption superimposition. These can be handled efficiently by the clip-level matching technique, which summarizes the full content of a video clip as a single signature. However, temporal-based transformations involving random insertion and deletion operations pose a great challenge to clip-level matching. Although some studies employ the frame-level matching technique to deal with temporal-based transformations, its high computational complexity might make it impractical in real applications. In this paper, we present a novel search method to address the above-mentioned problems. A given query video clip is partitioned into short segments, each of which then linearly scans over the video clips in a dataset. Rather than performing an exhaustive search, we derive similarity upper bounds for these query segments and use them as a filter to skip unnecessary matching. In addition, we present a min-hash-based inverted indexing mechanism to find candidate clips in the dataset. Our experimental results demonstrate that the proposed method is robust and efficient in dealing with temporal-based video copies. [Copyright © Elsevier]
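
Illustrative sketch: the abstract mentions a min-hash-based inverted index for retrieving candidate clips before segment matching. The short Python sketch below shows one common way such an index can work; it is not the authors' implementation, and the signature length, feature representation, and function names are assumptions made for illustration only.

    import random
    from collections import defaultdict

    NUM_HASHES = 20          # assumed min-hash signature length
    MAX_ID = 2**31 - 1
    random.seed(0)
    # Random linear hash functions of the form (a*x + b) mod MAX_ID (assumed scheme).
    HASH_PARAMS = [(random.randrange(1, MAX_ID), random.randrange(MAX_ID))
                   for _ in range(NUM_HASHES)]

    def min_hash_signature(feature_ids):
        # Compute a min-hash signature for a set of quantized frame-feature IDs.
        return tuple(min((a * f + b) % MAX_ID for f in feature_ids)
                     for a, b in HASH_PARAMS)

    # Inverted index: each (hash position, min value) bucket maps to clip IDs.
    index = defaultdict(set)

    def add_clip(clip_id, feature_ids):
        # Index a dataset clip under every component of its signature.
        for pos, val in enumerate(min_hash_signature(feature_ids)):
            index[(pos, val)].add(clip_id)

    def candidate_clips(query_feature_ids):
        # Return dataset clips sharing at least one min-hash value with the query;
        # only these candidates would then undergo the segment-level matching.
        candidates = set()
        for pos, val in enumerate(min_hash_signature(query_feature_ids)):
            candidates |= index[(pos, val)]
        return candidates

In this kind of scheme, clips with a high Jaccard similarity to the query are likely to collide in at least one signature position, so the index prunes most of the dataset before the more expensive segment matching with similarity upper bounds is applied.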

Details

Language :
English
ISSN :
0925-2312
Volume :
105
Database :
Academic Search Index
Journal :
Neurocomputing
Publication Type :
Academic Journal
Accession number :
86408390
Full Text :
https://doi.org/10.1016/j.neucom.2012.04.036