RHanDS: Refining Malformed Hands for Generated Images with Decoupled Structure and Style Guidance

Authors

  • Chengrui Wang (Taobao & Tmall Group of Alibaba)
  • Pengfei Liu (Taobao & Tmall Group of Alibaba; Xiamen University)
  • Min Zhou (Taobao & Tmall Group of Alibaba)
  • Ming Zeng (Xiamen University)
  • Xubin Li (Taobao & Tmall Group of Alibaba)
  • Tiezheng Ge (Taobao & Tmall Group of Alibaba)
  • Bo Zheng (Taobao & Tmall Group of Alibaba)

DOI:

https://doi.org/10.1609/aaai.v39i7.32815

Abstract

Although diffusion models can generate high-quality human images, their application is limited by their instability in generating hands with correct structure. In this paper, we introduce RHanDS, a conditional diffusion-based framework designed to refine malformed hands using decoupled structure and style guidance. A hand mesh reconstructed from the malformed hand provides structure guidance for correcting the hand's structure, while the malformed hand itself provides style guidance for preserving the hand's style. To alleviate mutual interference between the style and structure guidance, we introduce a two-stage training strategy and build a series of multi-style hand datasets. In the first stage, the model is trained on paired hand images to ensure stylistic consistency during hand refining. In the second stage, it is trained on hand images in various styles generated from human meshes, enabling it to gain control over hand structure. Experimental results demonstrate that RHanDS effectively refines hand structure while preserving consistency in hand style.

Published

2025-04-11

How to Cite

Wang, C., Liu, P., Zhou, M., Zeng, M., Li, X., Ge, T., & Zheng, B. (2025). RHanDS: Refining Malformed Hands for Generated Images with Decoupled Structure and Style Guidance. Proceedings of the AAAI Conference on Artificial Intelligence, 39(7), 7573–7581. https://doi.org/10.1609/aaai.v39i7.32815

Section

AAAI Technical Track on Computer Vision VI