Digital video tampering detection using texture with compressed passive technic.
- Source :
- AIP Conference Proceedings. 2024, Vol. 2802 Issue 1, p1-8. 8p.
- Publication Year :
- 2024
Abstract
- Nowadays, video recordings can be effortlessly captured and edited with easy-to-use editing tools. Such videos are shared on social networks for harassment and to spread false propaganda. During spatial forgery, the texture and micro-patterns of the frames become inconsistent, which can be observed in the difference between two consecutive frames. Such forgeries are extremely difficult to detect with the naked eye. In the splicing method, tampering is performed using objects taken from other videos. In this article, we propose a forgery-identification technique that detects video forgery in the spatial domain of recordings. Furthermore, high-pass filter layers and max-pooling are used to reduce computational complexity and to enhance the residual that remains after the forgery process. Additionally, the proposed technique represents the deep structure of the forged video frames and efficiently classifies a video as authentic or forged. High accuracy is achieved compared with state-of-the-art techniques. However, the proposed method does not work on videos in which the objects do not move. Moreover, we performed an experiment in which objects were removed and replaced using the pen tool. Finally, we conclude that numerous video-editing tools, such as Adobe Premiere, Adobe After Effects, the pen tool, and Vegas, are readily available and can be used to tamper with video. [ABSTRACT FROM AUTHOR]
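The abstract describes detecting spatial forgery through texture inconsistencies in the difference of two consecutive frames, enhanced by high-pass filtering and shrunk by max-pooling. The sketch below is an assumption-laden illustration of that pipeline, not the paper's implementation: the Laplacian-style kernel, the pooling size, and the plain list-of-lists frame representation are all illustrative choices.

```python
# Hedged sketch of the frame-difference + high-pass + max-pooling idea from
# the abstract. Frames are plain 2-D lists of grayscale intensities; the
# kernel and pooling size are assumed, not taken from the paper.

def frame_difference(f1, f2):
    """Absolute per-pixel difference of two equally sized frames."""
    return [[abs(a - b) for a, b in zip(r1, r2)]
            for r1, r2 in zip(f1, f2)]

# Simple 3x3 Laplacian-style high-pass kernel (an assumed choice) that
# suppresses smooth regions and enhances tampering residuals.
HIGH_PASS = [
    [ 0, -1,  0],
    [-1,  4, -1],
    [ 0, -1,  0],
]

def convolve(frame, kernel):
    """'Valid' 2-D convolution: only fully covered positions are kept."""
    h, w = len(frame), len(frame[0])
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for i in range(h - kh + 1):
        row = []
        for j in range(w - kw + 1):
            row.append(sum(kernel[u][v] * frame[i + u][j + v]
                           for u in range(kh) for v in range(kw)))
        out.append(row)
    return out

def max_pool(frame, size=2):
    """Non-overlapping max pooling to reduce the residual map's size."""
    h, w = len(frame), len(frame[0])
    return [[max(frame[i + u][j + v]
                 for u in range(size) for v in range(size))
             for j in range(0, w - size + 1, size)]
            for i in range(0, h - size + 1, size)]
```

In a full system the pooled residual map would feed a classifier that labels the video authentic or forged; here the point is only that a spliced region produces a strong localized response in the high-pass-filtered frame difference, while static, untampered regions produce none.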
- Subjects :
- *VIDEO editing
*FRAUD
*VIDEO recording
*SOCIAL networks
*DIGITAL video
*FORGERY
Details
- Language :
- English
- ISSN :
- 0094-243X
- Volume :
- 2802
- Issue :
- 1
- Database :
- Academic Search Index
- Journal :
- AIP Conference Proceedings
- Publication Type :
- Conference
- Accession number :
- 175035834
- Full Text :
- https://doi.org/10.1063/5.0181757