
Divide & Bind Your Attention for Improved Generative Semantic Nursing

Authors :
Li, Yumeng
Keuper, Margret
Zhang, Dan
Khoreva, Anna
Publication Year :
2023

Abstract

Emerging large-scale text-to-image generative models, e.g., Stable Diffusion (SD), have demonstrated impressive high-fidelity results. Despite this remarkable progress, current state-of-the-art models still struggle to generate images that fully adhere to the input prompt. Prior work, Attend & Excite, introduced the concept of Generative Semantic Nursing (GSN), which optimizes cross-attention during inference time to better incorporate the prompt semantics. It demonstrates promising results on simple prompts, e.g., "a cat and a dog". However, its efficacy declines on more complex prompts, and it does not explicitly address the problem of improper attribute binding. To address the challenges posed by complex prompts or scenarios involving multiple entities, and to achieve improved attribute binding, we propose Divide & Bind. We introduce two novel loss objectives for GSN: an attendance loss and a binding loss. Our approach stands out in its ability to faithfully synthesize desired objects with improved attribute alignment from complex prompts, and it exhibits superior performance across multiple evaluation benchmarks.

Comment: Accepted at BMVC 2023 as Oral. Code: https://github.com/boschresearch/Divide-and-Bind and project page: https://sites.google.com/view/divide-and-bind
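To illustrate the inference-time optimization that GSN performs, below is a minimal sketch of a single latent-update step in the Attend & Excite style referenced by the abstract. The function name `gsn_step`, the `attn_fn` callable, and the per-token loss are illustrative assumptions; the specific attendance and binding losses of Divide & Bind are not reproduced here, since the record does not give their formulas.

```python
import torch

def gsn_step(latent, attn_fn, token_ids, step_size=0.1):
    """Hypothetical Generative Semantic Nursing update (sketch only).

    latent:    current diffusion latent, shape (C, H, W)
    attn_fn:   assumed callable mapping the latent to cross-attention maps
               of shape (num_tokens, H', W')
    token_ids: indices of the prompt tokens that must be "attended to"
    """
    latent = latent.detach().requires_grad_(True)
    maps = attn_fn(latent)

    # Encourage each target token to have at least one strong attention
    # activation; take the worst (largest) loss across tokens.
    per_token = torch.stack([1.0 - maps[t].max() for t in token_ids])
    loss = per_token.max()

    # Nudge the latent against the gradient of the loss.
    grad = torch.autograd.grad(loss, latent)[0]
    return (latent - step_size * grad).detach()
```

In practice such a step is interleaved with the denoising loop of the diffusion model; Divide & Bind replaces the simple max-activation objective above with its attendance and binding losses to handle multiple entities and attribute binding.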

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2307.10864
Document Type :
Working Paper