
NIVeL: Neural Implicit Vector Layers for Text-to-Vector Generation

Authors :
Thamizharasan, Vikas
Liu, Difan
Fisher, Matthew
Zhao, Nanxuan
Kalogerakis, Evangelos
Lukac, Michal
Publication Year :
2024

Abstract

The success of denoising diffusion models in representing rich data distributions over 2D raster images has prompted research on extending them to other data representations, such as vector graphics. Unfortunately, due to their variable structure and the scarcity of vector training data, directly applying diffusion models in this domain remains a challenging problem. Workarounds such as optimization via Score Distillation Sampling (SDS) are also fraught with difficulty, as vector representations are non-trivial to optimize directly and tend to result in implausible geometries such as redundant or self-intersecting shapes. NIVeL addresses these challenges by reinterpreting the problem on an alternative, intermediate domain that preserves the desirable properties of vector graphics, namely sparsity of representation and resolution independence. This alternative domain is based on neural implicit fields expressed as a set of decomposable, editable layers. Based on our experiments, NIVeL produces text-to-vector graphics results of significantly better quality than the state of the art.
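
To make the idea of "neural implicit fields expressed as a set of decomposable, editable layers" concrete, below is a minimal PyTorch sketch of one possible interpretation: a coordinate-based MLP that maps 2D points to K per-layer occupancy values, so each output channel acts as a separate, resolution-independent layer. The class name, network depth, and activation choices are illustrative assumptions for this sketch, not the architecture specified in the paper.

import torch
import torch.nn as nn

class ImplicitLayerField(nn.Module):
    # Hypothetical sketch: a coordinate MLP mapping 2D points to K per-layer
    # occupancies, so each channel can be treated as an editable layer.
    def __init__(self, num_layers: int = 4, hidden: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, num_layers), nn.Sigmoid(),  # occupancy in [0, 1]
        )

    def forward(self, xy: torch.Tensor) -> torch.Tensor:
        # xy: (N, 2) coordinates in [0, 1]^2 -> (N, K) per-layer occupancies
        return self.net(xy)

# Because the field is queried at continuous coordinates, it can be sampled
# at any grid resolution (resolution independence by construction).
field = ImplicitLayerField(num_layers=4)
ys, xs = torch.meshgrid(torch.linspace(0, 1, 256),
                        torch.linspace(0, 1, 256), indexing="ij")
coords = torch.stack([xs, ys], dim=-1).reshape(-1, 2)
layers = field(coords).reshape(256, 256, 4)  # one channel per editable layer

In such a setup, each layer's occupancy map could later be contoured into vector paths; the details of how NIVeL supervises and converts these layers are described in the paper itself, not in this sketch.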

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2405.15217
Document Type :
Working Paper