
Unity Perception: Generate Synthetic Data for Computer Vision

Authors:
Borkman, Steve
Crespi, Adam
Dhakad, Saurav
Ganguly, Sujoy
Hogins, Jonathan
Jhang, You-Cyuan
Kamalzadeh, Mohsen
Li, Bowen
Leal, Steven
Parisi, Pete
Romero, Cesar
Smith, Wesley
Thaman, Alex
Warren, Samuel
Yadav, Nupur
Publication Year:
2021

Abstract

We introduce the Unity Perception package, which aims to simplify and accelerate the generation of synthetic datasets for computer vision tasks by offering an easy-to-use and highly customizable toolset. This open-source package extends the Unity Editor and engine components to generate perfectly annotated examples for several common computer vision tasks. Additionally, it offers an extensible Randomization framework that lets the user quickly construct and configure randomized simulation parameters to introduce variation into the generated datasets. We provide an overview of the provided tools and how they work, and demonstrate the value of the generated synthetic datasets by training a 2D object detection model. The model trained with mostly synthetic data outperforms the model trained using only real data.

Comment: We corrected the tasks supported by the NVISII platform. For the Unity Perception package, see https://github.com/Unity-Technologies/com.unity.perception

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2107.04259
Document Type:
Working Paper