Spatially Consistent Transformer for Colorization in Monochrome-Color Dual-Lens System.
Source: IEEE Transactions on Image Processing, 2022, Vol. 31, p. 6747-6760 (14 pp.)
Publication Year: 2022
Abstract
We study the colorization problem in monochrome-color dual-lens camera systems, i.e., colorizing the gray image from the monochrome camera using the color image from the color camera as reference. Among related methods, cost-volume-based CNN methods achieve state-of-the-art results, but they are costly in GPU memory because they build a 4D cost volume. Recently, slice-wise cross-attention methods have been proposed for related problems. Slice-wise cross-attention requires much less GPU memory, but applying it directly to this colorization problem does not produce competitive results. We exploit the non-local computation property of cross-attention to propose a transformer-based method. To overcome the limitations of straightforward slice-wise cross-attention, we propose the spatially consistent cross-attention (SCCA) block, which encourages pixels of slices across different epipolar lines in the gray image to find spatially consistent correspondences with pixels of the reference color image. To further reduce the memory cost while preserving colorization accuracy, we design a pyramid processing strategy that cascades a series of SCCA blocks with smaller slice sizes and performs colorization from coarse to fine. To extract more powerful image features, we use several regional self-attention (RSA) blocks with U-style connections. Experimental results show that our method outperforms the state-of-the-art methods by a large margin on the synthesized Cityscapes, Sintel, and SceneFlow datasets, as well as on a real monochrome-color dual-lens dataset. [ABSTRACT FROM AUTHOR]
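To make the slice-wise cross-attention idea concrete, below is a minimal PyTorch-style sketch of plain slice-wise cross-attention between gray-image and reference color-image features, assuming rectified views so correspondences lie along horizontal slices of epipolar lines. The function name, tensor shapes, and the single-head formulation are illustrative assumptions, not the paper's actual SCCA implementation, which additionally enforces consistency across different epipolar lines and cascades blocks in a coarse-to-fine pyramid.

```python
import torch

def slicewise_cross_attention(gray_feat, color_feat, slice_height=1):
    # gray_feat, color_feat: (B, C, H, W) feature maps from the monochrome
    # and reference color images. Assumes rectified views, so each gray
    # pixel's match lies within the same slice of epipolar lines.
    B, C, H, W = gray_feat.shape
    assert H % slice_height == 0
    S = H // slice_height  # number of slices along the height

    # Group `slice_height` consecutive rows into one slice of hW pixels.
    q = gray_feat.view(B, C, S, slice_height * W).permute(0, 2, 3, 1)   # (B, S, hW, C)
    k = color_feat.view(B, C, S, slice_height * W).permute(0, 2, 3, 1)  # (B, S, hW, C)
    v = k

    # Attention is restricted to the slice: each gray pixel attends only to
    # reference pixels on the same group of epipolar lines, which keeps the
    # attention matrix (hW x hW) per slice instead of (HW x HW) globally.
    attn = torch.softmax(q @ k.transpose(-1, -2) / C ** 0.5, dim=-1)    # (B, S, hW, hW)
    out = attn @ v                                                       # (B, S, hW, C)
    return out.permute(0, 3, 1, 2).reshape(B, C, H, W)
```

As a rough memory comparison, this per-slice attention avoids the 4D cost volume of CNN-based methods; the paper's SCCA and pyramid strategy then address the spatial inconsistency that such straightforward slicing introduces between neighboring epipolar lines.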
Subjects: IMAGE color analysis; CONSTRUCTION cost estimates
Details
Language: English
ISSN: 1057-7149
Volume: 31
Database: Academic Search Index
Journal: IEEE Transactions on Image Processing
Publication Type: Academic Journal
Accession Number: 170077402
Full Text: https://doi.org/10.1109/TIP.2022.3215910