
Editable Neural Radiance Fields Convert 2D to 3D Furniture Texture

Authors :
Chaoyi Tan
Chenghao Wang
Zheng Lin
Shuyao He
Chao Li
Source :
International Journal of Engineering and Management Research; Vol. 14 No. 3 (2024): June Issue; 62-65; 2250-0758; 2394-6962
Publication Year :
2024

Abstract

Our work presents a neural network designed to convert textual descriptions into 3D models. Leveraging an encoder-decoder architecture, we fuse textual information with attributes such as shape, color, and position. This fused representation is fed into a generator that predicts new furniture objects enriched with detail such as color and shape.[1] The predicted furniture objects are then passed through an encoder to extract feature information, which the loss function uses to propagate errors and update the model weights. Once trained, the network can generate new 3D objects from textual input alone, demonstrating the potential of our approach for producing customizable 3D models from descriptive text.[2]
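The pipeline the abstract describes (text encoder → attribute fusion → generator → feature encoder → feature-space loss) can be sketched as a toy NumPy example. All dimensions, weight matrices, and function names below are illustrative assumptions, not details from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (assumptions, not from the paper).
TEXT_DIM, ATTR_DIM, LATENT_DIM, N_POINTS = 16, 8, 24, 32

# Random matrices standing in for trained weights.
W_gen = rng.normal(size=(TEXT_DIM + ATTR_DIM, N_POINTS * 3)) * 0.1
W_enc = rng.normal(size=(N_POINTS * 3, LATENT_DIM)) * 0.1
EMBED = rng.normal(size=(100, TEXT_DIM))  # toy token embedding table

def encode_text(token_ids):
    """Toy text encoder: mean of per-token embeddings."""
    return EMBED[token_ids].mean(axis=0)

def generate(text_vec, attrs):
    """Generator: fused text+attribute vector -> 3D point cloud."""
    z = np.concatenate([text_vec, attrs])  # attribute fusion
    return np.tanh(z @ W_gen).reshape(N_POINTS, 3)

def encode_shape(points):
    """Feature encoder applied to a (predicted or target) object."""
    return np.tanh(points.reshape(-1) @ W_enc)

def feature_loss(pred_points, target_points):
    """MSE in feature space; its gradient would update the weights."""
    fp, ft = encode_shape(pred_points), encode_shape(target_points)
    return float(np.mean((fp - ft) ** 2))

text_vec = encode_text([3, 14, 15])     # e.g. tokens of "red wooden chair"
attrs = rng.normal(size=ATTR_DIM)       # shape/color/position attributes
pred = generate(text_vec, attrs)        # predicted furniture point cloud
target = rng.normal(size=(N_POINTS, 3))
loss = feature_loss(pred, target)       # scalar error to backpropagate
```

In a real implementation the encoders and generator would be learned networks (e.g. trained by gradient descent on this feature-space loss) rather than fixed random projections; the sketch only shows how the stages connect.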

Details

Database :
OAIster
Notes :
application/pdf, English
Publication Type :
Electronic Resource
Accession number :
edsoai.on1446544039
Document Type :
Electronic Resource