
Generating Person Images with Appearance-aware Pose Stylizer

Authors:
Huang, Siyu
Xiong, Haoyi
Cheng, Zhi-Qi
Wang, Qingzhong
Zhou, Xingran
Wen, Bihan
Huan, Jun
Dou, Dejing
Publication Year:
2020

Abstract

Generation of high-quality person images is challenging due to the sophisticated entanglements among image factors, e.g., appearance, pose, foreground, background, local details, and global structures. In this paper, we present a novel end-to-end framework to generate realistic person images based on given person poses and appearances. The core of our framework is a novel generator called Appearance-aware Pose Stylizer (APS), which generates human images by progressively coupling the target pose with the conditioned person appearance. The framework is highly flexible and controllable: it effectively decouples the various complex person image factors in the encoding phase and re-couples them in the decoding phase. In addition, we present a new normalization method named adaptive patch normalization, which enables region-specific normalization and performs well when adopted in person image generation models. Experiments on two benchmark datasets show that our method is capable of generating visually appealing and realistic-looking results using arbitrary image and pose inputs.

Comment: Appearing at IJCAI 2020. The code is available at https://github.com/siyuhuang/PoseStylizer
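
The abstract does not spell out how adaptive patch normalization is computed; the exact formulation is in the paper and the linked repository. Purely as an illustrative sketch of region-specific normalization, the snippet below splits a feature map into a grid of spatial patches, normalizes each patch with its own statistics, and then applies a learned per-channel affine transform. The class name AdaptivePatchNorm, the grid parameter, and the affine parameters are assumptions made here for illustration, not the authors' implementation.

    import torch
    import torch.nn as nn

    class AdaptivePatchNorm(nn.Module):
        # Hypothetical sketch of region-specific (patch-wise) normalization:
        # each spatial patch of the feature map is normalized with its own
        # mean/variance, followed by a learned per-channel affine transform.
        def __init__(self, num_channels, grid=4, eps=1e-5):
            super().__init__()
            self.grid = grid
            self.eps = eps
            self.weight = nn.Parameter(torch.ones(1, num_channels, 1, 1))
            self.bias = nn.Parameter(torch.zeros(1, num_channels, 1, 1))

        def forward(self, x):
            n, c, h, w = x.shape
            g = self.grid
            # Assumes spatial size is divisible by the patch grid.
            assert h % g == 0 and w % g == 0
            # Reshape so each (h/g x w/g) patch forms its own normalization region.
            x = x.view(n, c, g, h // g, g, w // g)
            mean = x.mean(dim=(3, 5), keepdim=True)
            var = x.var(dim=(3, 5), keepdim=True, unbiased=False)
            x = (x - mean) / torch.sqrt(var + self.eps)
            x = x.view(n, c, h, w)
            return x * self.weight + self.bias

    # Usage: normalize a 64-channel feature map over a 4x4 patch grid.
    feat = torch.randn(2, 64, 32, 32)
    out = AdaptivePatchNorm(64, grid=4)(feat)
    print(out.shape)  # torch.Size([2, 64, 32, 32])

In the paper, the affine modulation is conditioned on the person appearance rather than being a fixed learned parameter as in this sketch; consult the repository for the authors' actual layer.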

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2007.09077
Document Type:
Working Paper