Semantic segmentation of textured mosaics
- Author
- Côté, Mélissa, Dash, Amanda, and Branzan Albu, Alexandra
- Subjects
- *TEXTURE analysis (Image processing), *DATA augmentation, *DEEP learning
- Abstract
- This paper investigates deep learning (DL)-based semantic segmentation of textured mosaics. Existing popular datasets for mosaic texture segmentation, designed before the DL era, have several limitations: (1) training images are single-textured and thus differ from the multi-textured test images; (2) training and test textures are typically cut out from the same raw images, which may hinder model generalization; (3) each test image has its own limited set of training images, forcing an inefficient training of one model per test image from little data. We propose two texture segmentation datasets, based on the existing Outex and DTD datasets, that are suitable for training semantic segmentation networks and that address the above limitations: SemSegOutex focuses on materials acquired under controlled conditions, and SemSegDTD focuses on visual attributes of textures acquired in the wild. We also generate a synthetic version of SemSegOutex via texture synthesis that can be used in the same way as standard random data augmentation. Finally, we study the performance of the state-of-the-art DeepLabv3+ for textured mosaic segmentation, which is excellent on SemSegOutex and variable on SemSegDTD. Our datasets allow us to analyze results according to the type of material, visual attributes, various image acquisition artifacts, and natural versus synthetic aspects, yielding new insights into the possible usage of recent DL technologies for texture analysis. Article highlights: We propose two texture segmentation datasets that address the limitations of existing texture segmentation datasets. Experiments with materials and attributes shed new light on recent deep learning technologies for texture analysis. Our results also suggest that synthetic textures can be used for data augmentation to improve segmentation results. [ABSTRACT FROM AUTHOR]
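- The abstract describes test images built as multi-textured mosaics with per-pixel texture labels, in contrast to single-textured training images. A minimal sketch of that idea, assuming a simple grid layout and NumPy arrays standing in for Outex/DTD patches (the `make_mosaic` helper and the random patches are hypothetical illustrations, not the authors' actual dataset-generation code):

```python
import numpy as np

def make_mosaic(patches, grid=(2, 2)):
    """Tile single-texture patches into a multi-textured mosaic.

    patches: list of equally sized HxW (grayscale) arrays, one per texture
    class. Returns (mosaic, mask), where mask stores the class index of the
    source texture at every pixel -- the ground truth a semantic
    segmentation network such as DeepLabv3+ would be trained against.
    Hypothetical helper for illustration only.
    """
    h, w = patches[0].shape
    rows, cols = grid
    mosaic = np.zeros((rows * h, cols * w), dtype=patches[0].dtype)
    mask = np.zeros((rows * h, cols * w), dtype=np.int64)
    for idx, patch in enumerate(patches[: rows * cols]):
        r, c = divmod(idx, cols)
        mosaic[r * h:(r + 1) * h, c * w:(c + 1) * w] = patch
        mask[r * h:(r + 1) * h, c * w:(c + 1) * w] = idx
    return mosaic, mask

# Example: four random 64x64 "textures" standing in for real patches.
rng = np.random.default_rng(0)
patches = [rng.integers(0, 256, size=(64, 64), dtype=np.uint8) for _ in range(4)]
mosaic, mask = make_mosaic(patches)  # 128x128 mosaic with 4-class label mask
```

A real pipeline would cut the patches from distinct raw source images (one of the limitations the paper highlights) and could use arbitrary region shapes rather than a regular grid.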
- Published
- 2023