
FlexEdit: Flexible and Controllable Diffusion-based Object-centric Image Editing

Authors :
Nguyen, Trong-Tung
Nguyen, Duc-Anh
Tran, Anh
Pham, Cuong
Publication Year :
2024

Abstract

Our work addresses limitations seen in previous approaches for object-centric editing problems, such as unrealistic results due to shape discrepancies and limited control in object replacement or insertion. To this end, we introduce FlexEdit, a flexible and controllable editing framework for objects where we iteratively adjust latents at each denoising step using our FlexEdit block. Initially, we optimize latents at test time to align with specified object constraints. Then, our framework employs an adaptive mask, automatically extracted during denoising, to protect the background while seamlessly blending new content into the target image. We demonstrate the versatility of FlexEdit in various object editing tasks and curate an evaluation test suite with samples from both real and synthetic images, along with novel evaluation metrics designed for object-centric editing. We conduct extensive experiments on different editing scenarios, demonstrating the superiority of our editing framework over recent advanced text-guided image editing methods. Our project page is published at https://flex-edit.github.io/.

Comment : Our project page: https://flex-edit.github.io/
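To make the described per-step procedure concrete, below is a minimal, hypothetical sketch of a FlexEdit-style denoising loop. It is not the authors' implementation: the `toy_denoiser`, the `object_constraint_loss`, the mean-threshold rule for the adaptive mask, and the step schedule are all illustrative assumptions standing in for the paper's actual diffusion model, constraints, and mask extraction.

```python
# Hypothetical sketch of a FlexEdit-style denoising loop (not the authors' code).
# A toy "denoiser" stands in for a real diffusion UNet; the object constraint,
# adaptive-mask rule, and schedule below are illustrative assumptions.
import torch

torch.manual_seed(0)

def toy_denoiser(latent, t):
    """Stand-in for a diffusion model's noise prediction at timestep t."""
    return 0.1 * latent * (t / 50.0)

def object_constraint_loss(latent, target_region):
    """Illustrative constraint: pull latent values inside the edit region
    toward a target statistic (e.g. to respect a desired object shape)."""
    return ((latent * target_region).mean() - 1.0) ** 2

def flexedit_step(latent, source_latent, t, target_region, lr=0.1, opt_steps=3):
    # 1) Test-time latent optimization against the object constraint.
    latent = latent.detach().requires_grad_(True)
    for _ in range(opt_steps):
        loss = object_constraint_loss(latent, target_region)
        (grad,) = torch.autograd.grad(loss, latent)
        latent = (latent - lr * grad).detach().requires_grad_(True)
    latent = latent.detach()

    # 2) One denoising update (placeholder for a DDIM-like step).
    eps = toy_denoiser(latent, t)
    latent = latent - eps

    # 3) Adaptive mask (assumed thresholding rule): keep edited content where
    #    the latent deviates most from the source; copy the background from
    #    the source latent elsewhere, protecting unedited regions.
    diff = (latent - source_latent).abs()
    mask = (diff > diff.mean()).float()
    latent = mask * latent + (1.0 - mask) * source_latent
    return latent

# Usage on a dummy 1x4x8x8 latent with a square "object" region.
latent = torch.randn(1, 4, 8, 8)
source_latent = latent.clone()
target_region = torch.zeros_like(latent)
target_region[..., 2:6, 2:6] = 1.0

for t in range(50, 0, -1):
    latent = flexedit_step(latent, source_latent, t, target_region)
print("final latent stats:", latent.mean().item(), latent.std().item())
```

The sketch only mirrors the structure the abstract describes (per-step latent optimization, denoising, then mask-based blending with the source); the real system presumably derives its mask from model-internal signals rather than a simple difference threshold.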

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2403.18605
Document Type :
Working Paper