Event-Driven Heterogeneous Network for Video Deraining.
- Source :
- International Journal of Computer Vision. Dec2024, Vol. 132 Issue 12, p5841-5861. 21p.
- Publication Year :
- 2024
Abstract
- Restoring clear frames from rainy videos poses a significant challenge due to the swift motion of rain streaks. Traditional frame-based visual sensors, which record dense scene content synchronously, struggle to capture the fast-moving rain information. Conversely, the novel bio-inspired event camera, known for its high temporal resolution and low latency, effectively records the motion trajectories of rapidly falling rain through asynchronously generated sparse event sequences. In light of these attributes, we introduce a novel event-driven convolutional spiking network for video deraining. For video restoration, we employ a Convolutional Neural Network as the backbone, extracting dense features from video sequences to map rain-soaked frames to clear ones. To remove rain, we meticulously design a bio-inspired Spiking Neural Network that adapts to the sparse event sequences, capturing features of falling rain. We then establish a bimodal feature fusion module that combines dense convolutional features with sparse spiking features. This fusion aids the backbone in accurately pinpointing rain streaks across spatiotemporal dimensions. Thus, our heterogeneous network extracts and fuses information from both events and videos, enhancing deraining performance. Experiments conducted on synthetic and real-world datasets demonstrate that our network markedly surpasses existing video deraining techniques. [ABSTRACT FROM AUTHOR]
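- The abstract outlines a two-branch design: a dense CNN branch over rainy frames, a sparse spiking branch over event sequences, and a bimodal fusion module that combines the two. The sketch below illustrates that idea only; it is a minimal, hypothetical PyTorch-style toy, not the authors' implementation, and all module names, the simplified leaky integrate-and-fire dynamics, and the channel sizes are assumptions.

```python
# Minimal sketch of a frame-CNN + event-SNN + fusion pipeline (hypothetical,
# not the paper's code). Assumes voxelized event slices as the SNN input.
import torch
import torch.nn as nn


class SimpleLIF(nn.Module):
    """Toy leaky integrate-and-fire layer over a sequence of event features."""
    def __init__(self, channels, decay=0.8, threshold=1.0):
        super().__init__()
        self.conv = nn.Conv2d(channels, channels, 3, padding=1)
        self.decay = decay
        self.threshold = threshold

    def forward(self, event_seq):  # (B, T, C, H, W)
        b, t, c, h, w = event_seq.shape
        mem = torch.zeros(b, c, h, w, device=event_seq.device)
        spikes = []
        for step in range(t):
            mem = self.decay * mem + self.conv(event_seq[:, step])
            spk = (mem >= self.threshold).float()  # hard threshold (no surrogate gradient here)
            mem = mem - spk * self.threshold       # reset by subtraction
            spikes.append(spk)
        return torch.stack(spikes, dim=1).mean(dim=1)  # temporal average of spike maps


class BimodalFusionDeraining(nn.Module):
    """Dense CNN branch for frames, sparse spiking branch for events, then fusion."""
    def __init__(self, frame_ch=3, event_ch=16, feat_ch=32):
        super().__init__()
        self.frame_branch = nn.Sequential(
            nn.Conv2d(frame_ch, feat_ch, 3, padding=1), nn.ReLU(),
            nn.Conv2d(feat_ch, feat_ch, 3, padding=1), nn.ReLU(),
        )
        self.event_embed = nn.Conv2d(event_ch, feat_ch, 3, padding=1)
        self.event_branch = SimpleLIF(feat_ch)
        self.fusion = nn.Sequential(              # bimodal feature fusion
            nn.Conv2d(2 * feat_ch, feat_ch, 1), nn.ReLU(),
        )
        self.decoder = nn.Conv2d(feat_ch, frame_ch, 3, padding=1)

    def forward(self, rainy_frame, event_seq):
        dense = self.frame_branch(rainy_frame)                  # dense frame features
        emb = torch.stack([self.event_embed(e) for e in event_seq.unbind(1)], dim=1)
        sparse = self.event_branch(emb)                         # sparse spiking features
        fused = self.fusion(torch.cat([dense, sparse], dim=1))  # locate rain streaks
        return rainy_frame - self.decoder(fused)                # predict a residual rain layer


if __name__ == "__main__":
    model = BimodalFusionDeraining()
    frame = torch.rand(1, 3, 64, 64)        # one rainy frame
    events = torch.rand(1, 5, 16, 64, 64)   # 5 voxelized event slices
    print(model(frame, events).shape)       # torch.Size([1, 3, 64, 64])
```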
- Subjects :
- *CONVOLUTIONAL neural networks
- *VIDEOS
Details
- Language :
- English
- ISSN :
- 0920-5691
- Volume :
- 132
- Issue :
- 12
- Database :
- Academic Search Index
- Journal :
- International Journal of Computer Vision
- Publication Type :
- Academic Journal
- Accession number :
- 180936122
- Full Text :
- https://doi.org/10.1007/s11263-024-02148-x