
Best Frame Selection to Enhance Training Step Efficiency in Video-Based Human Action Recognition

Authors :
Abdorreza Alavi Gharahbagh
Vahid Hajihashemi
Marta Campos Ferreira
José J. M. Machado
João Manuel R. S. Tavares
Source :
Applied Sciences, Vol 12, Iss 4, p 1830 (2022)
Publication Year :
2022
Publisher :
MDPI AG, 2022.

Abstract

In recent years, with the growth of digital media and modern imaging equipment, the use of video processing algorithms and the semantic management of film and image collections has expanded, and video datasets are increasingly used to train artificial intelligence algorithms in many fields. Because a video carries a high volume of information, processing it remains expensive for most hardware systems, mainly in terms of runtime and memory. The optimal selection of keyframes, which minimizes redundant information, has therefore become an important way to mitigate this problem: eliminating frames simultaneously reduces the computational load, hardware cost, memory and processing time of intelligent video-based systems. Based on these considerations, this research proposes a method for selecting keyframes and adaptively cropping the input video for human action recognition (HAR) systems. The proposed method combines edge detection, simple frame differencing, adaptive thresholding, and 1D and 2D average filters in a hierarchical scheme. Several HAR methods were trained on videos processed by the proposed method to assess its efficiency. The results demonstrate that the proposed method increases the accuracy of the HAR system by up to 3% compared to random frame selection and cropping, and, in most cases, reduces the training time of the machine learning algorithm used.
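The abstract names the building blocks of the keyframe-selection stage (edge detection, simple frame differencing, adaptive thresholding, and 1D/2D average filters) but not their exact hierarchy or parameters. The following is only a minimal illustrative sketch of how such a pipeline could be wired together, not the authors' actual method: the function name, window size, threshold rule (mean plus k standard deviations), and Canny parameters are all assumptions for demonstration.

```python
import cv2
import numpy as np


def select_keyframes(video_path, window=5, k=1.0):
    """Hypothetical sketch: score frames by edge-based differences,
    smooth the scores with a 1D average filter, and keep frames whose
    smoothed score exceeds an adaptive (mean + k * std) threshold."""
    cap = cv2.VideoCapture(video_path)
    scores, prev_edges = [], None
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # 2D average filter to suppress noise before edge detection
        blurred = cv2.blur(gray, (5, 5))
        edges = cv2.Canny(blurred, 100, 200)
        if prev_edges is not None:
            # simple difference between consecutive edge maps
            scores.append(float(np.mean(cv2.absdiff(edges, prev_edges))))
        prev_edges = edges
    cap.release()

    scores = np.asarray(scores, dtype=np.float32)
    # 1D average filter over the per-frame difference scores
    smoothed = np.convolve(scores, np.ones(window) / window, mode="same")
    # adaptive threshold: mean plus k standard deviations (assumed rule)
    threshold = smoothed.mean() + k * smoothed.std()
    # frame indices, offset by 1 because the first frame has no score
    return [i + 1 for i, s in enumerate(smoothed) if s >= threshold]
```

In this sketch, frames with large edge-map changes (likely motion) are kept while near-static frames are dropped, which is the general intuition behind reducing redundant frames before HAR training; the adaptive cropping step described in the paper is not modelled here.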

Details

Language :
English
ISSN :
2076-3417
Volume :
12
Issue :
4
Database :
Directory of Open Access Journals
Journal :
Applied Sciences
Publication Type :
Academic Journal
Accession number :
edsdoj.6a0cbc2e051845d1a73f868b1781f376
Document Type :
article
Full Text :
https://doi.org/10.3390/app12041830