Perpendicular-Cutdepth: Perpendicular Direction Depth Cutting Data Augmentation Method.
- Source :
- Computers, Materials & Continua; 2024, Vol. 79 Issue 1, p927-941, 15p
- Publication Year :
- 2024
Abstract
- Depth estimation is an important task in computer vision. Collecting data at scale for monocular depth estimation is challenging, as this task requires simultaneously capturing RGB images and depth information. Therefore, data augmentation is crucial for this task. Existing data augmentation methods often employ pixel-wise transformations, which may inadvertently disrupt edge features. In this paper, we propose a data augmentation method for monocular depth estimation, which we refer to as the Perpendicular-Cutdepth method. This method involves cutting real-world depth maps along perpendicular directions and pasting them onto input images, thereby diversifying the data without compromising edge features. To validate the effectiveness of the algorithm, we compared it against current mainstream data augmentation algorithms using an existing convolutional neural network (CNN). Additionally, to verify the algorithm's applicability to Transformer networks, we designed an encoder-decoder network structure based on Transformer to assess the generalization of our proposed algorithm. Experimental results demonstrate that, in the field of monocular depth estimation, our proposed method, Perpendicular-Cutdepth, outperforms traditional data augmentation methods. On the indoor dataset NYU, our method increases accuracy from 0.900 to 0.907 and reduces the error rate from 0.357 to 0.351. On the outdoor dataset KITTI, our method improves accuracy from 0.9638 to 0.9642 and decreases the error rate from 0.060 to 0.0598. [ABSTRACT FROM AUTHOR]
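- The following is a minimal, hypothetical sketch of the kind of augmentation the abstract describes (cutting a depth map along a perpendicular direction and pasting it onto the RGB input). It is not the authors' reference implementation; the vertical strip orientation, the uniform sampling of strip width and position, and the min-max normalization of depth are all assumptions for illustration.

```python
# Hypothetical sketch of a Perpendicular-Cutdepth-style augmentation.
# Assumptions (not from the paper): the strip is vertical, its width and
# position are sampled uniformly, and the depth strip is min-max normalized
# before being pasted into the RGB image.
import numpy as np


def perpendicular_cutdepth(rgb: np.ndarray, depth: np.ndarray,
                           max_ratio: float = 0.5,
                           rng: np.random.Generator | None = None) -> np.ndarray:
    """Paste a vertical strip of the depth map onto the RGB image.

    rgb:   (H, W, 3) float array in [0, 1]
    depth: (H, W) float array of depth values
    """
    rng = rng or np.random.default_rng()
    h, w, _ = rgb.shape

    # Sample strip width and horizontal position (assumed uniform).
    strip_w = int(w * rng.uniform(0.1, max_ratio))
    x0 = int(rng.integers(0, w - strip_w + 1))

    # Normalize the depth strip to [0, 1] so it can replace RGB intensities.
    d = depth[:, x0:x0 + strip_w]
    d = (d - d.min()) / (d.max() - d.min() + 1e-8)

    out = rgb.copy()
    # Replicate the normalized depth across the three color channels.
    out[:, x0:x0 + strip_w, :] = d[..., None]
    return out
```

- Because the pasted region is a full-height strip rather than a per-pixel perturbation, the depth discontinuities (edges) inside the strip are preserved, which is the property the abstract emphasizes.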
Details
- Language :
- English
- ISSN :
- 1546-2218
- Volume :
- 79
- Issue :
- 1
- Database :
- Complementary Index
- Journal :
- Computers, Materials & Continua
- Publication Type :
- Academic Journal
- Accession number :
- 176916279
- Full Text :
- https://doi.org/10.32604/cmc.2024.048889