ContactGen: Contact-Guided Interactive 3D Human Generation for Partners

Authors

  • Dongjun Gu, UNIST
  • Jaehyeok Shim, UNIST
  • Jaehoon Jang, UNIST
  • Changwoo Kang, UNIST
  • Kyungdon Joo, UNIST

DOI:

https://doi.org/10.1609/aaai.v38i3.27962

Keywords:

CV: 3D Computer Vision, ML: Deep Generative Models & Autoencoders

Abstract

Among the various interactions between humans, such as eye contact and gestures, physical contact is an essential cue for understanding human behavior. Motivated by this observation, we introduce a new task: generating a 3D human that physically interacts with a given 3D partner human under a desired interaction label. Unlike previous works that interact with static objects or scenes, a partner human can take diverse poses, and the contact regions differ according to the type of interaction. To handle this challenge, we propose ContactGen, a method that generates interactive 3D humans for a given partner human based on a guided diffusion framework. Specifically, we present a contact prediction module that adaptively estimates potential contact regions between the two humans according to the interaction label. Using the estimated potential contact regions as complementary guidance, ContactGen generates interactive 3D humans for the given partner human within a guided diffusion model. We evaluate ContactGen on the CHI3D dataset, where it generates physically more plausible and diverse poses than comparison methods.
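The abstract describes guidance-based diffusion sampling driven by predicted contact regions. The sketch below is only an illustration of that general idea, not the authors' implementation: the `denoiser`, `body_model`, and `contact_predictor` objects, their methods (`predict_x0`, `reverse_step`, `param_dim`), and the specific guidance loss are all hypothetical, assuming the contact guidance is applied as the gradient of a contact-distance loss on the denoised estimate at each reverse step.

```python
import torch

def contact_guidance_loss(gen_verts, partner_verts, contact_probs, thresh=0.5):
    """Pull generated-human vertices toward partner regions whose predicted
    contact probability exceeds a threshold (an assumed guidance form)."""
    mask = contact_probs > thresh                      # (V_partner,) boolean mask
    if not mask.any():
        return gen_verts.sum() * 0.0                   # keep graph connected when no contacts predicted
    targets = partner_verts[mask]                      # (M, 3) predicted contact points
    # distance from each predicted contact point to its nearest generated vertex
    dists = torch.cdist(targets, gen_verts).min(dim=1).values
    return dists.mean()

def sample_interactive_human(denoiser, body_model, contact_predictor,
                             partner_verts, label, steps=1000, scale=1.0):
    """One reverse diffusion pass with contact guidance applied at every step.
    All objects passed in are hypothetical stand-ins for the paper's modules."""
    x = torch.randn(denoiser.param_dim)                # noised body parameters
    contact_probs = contact_predictor(partner_verts, label)
    for t in reversed(range(steps)):
        with torch.no_grad():
            x0 = denoiser.predict_x0(x, t, partner_verts, label)  # denoised estimate
        # guidance: nudge the denoised estimate toward the predicted contact regions
        x0 = x0.detach().requires_grad_(True)
        verts = body_model(x0)                         # body parameters -> mesh vertices
        loss = contact_guidance_loss(verts, partner_verts, contact_probs)
        grad, = torch.autograd.grad(loss, x0)
        x0 = (x0 - scale * grad).detach()
        with torch.no_grad():
            x = denoiser.reverse_step(x, x0, t)        # one DDPM-style reverse step
    return body_model(x)
```

The structure follows the familiar guided-diffusion pattern: the diffusion model proposes a pose, and the gradient of a contact loss nudges it toward the regions that the contact prediction module marks as likely contacts for the given interaction label.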

Published

2024-03-24

How to Cite

Gu, D., Shim, J., Jang, J., Kang, C., & Joo, K. (2024). ContactGen: Contact-Guided Interactive 3D Human Generation for Partners. Proceedings of the AAAI Conference on Artificial Intelligence, 38(3), 1923-1931. https://doi.org/10.1609/aaai.v38i3.27962

Section

AAAI Technical Track on Computer Vision II