ProCC: Progressive Cross-Primitive Compatibility for Open-World Compositional Zero-Shot Learning

Authors

  • Fushuo Huo Department of Computing, The Hong Kong Polytechnic University, Hong Kong SAR
  • Wenchao Xu Department of Computing, The Hong Kong Polytechnic University, Hong Kong SAR
  • Song Guo Department of Computer Science and Engineering, The Hong Kong University of Science and Technology, Hong Kong SAR
  • Jingcai Guo Department of Computing, The Hong Kong Polytechnic University, Hong Kong SAR; The Hong Kong Polytechnic University Shenzhen Research Institute, Shenzhen, China
  • Haozhao Wang School of Computer Science and Technology, Huazhong University of Science and Technology, Wuhan, China
  • Ziming Liu Department of Computing, The Hong Kong Polytechnic University, Hong Kong SAR
  • Xiaocheng Lu Department of Computer Science and Engineering, The Hong Kong University of Science and Technology, Hong Kong SAR

DOI:

https://doi.org/10.1609/aaai.v38i11.29164

Keywords:

ML: Transfer, Domain Adaptation, Multi-Task Learning, ML: Multimodal Learning

Abstract

Open-World Compositional Zero-Shot Learning (OW-CZSL) aims to recognize novel compositions of state and object primitives in images with no priors on the compositional space, which induces a tremendously large output space containing all possible state-object compositions. Existing works either learn a joint compositional state-object embedding or predict the simple primitives with separate classifiers; however, the former heavily relies on external word embeddings, while the latter ignores the interactions between interdependent primitives. In this paper, we revisit the primitive prediction approach and propose a novel method, termed Progressive Cross-primitive Compatibility (ProCC), to mimic the human learning process for OW-CZSL tasks. Specifically, the cross-primitive compatibility module explicitly learns to model the interactions of state and object features with trainable memory units, which efficiently acquires cross-primitive visual attention to reason about high-feasibility compositions, without the aid of external knowledge. Moreover, to alleviate invalid cross-primitive interactions, especially under the partial-supervision setting (pCZSL), we design a progressive training paradigm that optimizes the primitive classifiers conditioned on pre-trained features in an easy-to-hard manner. Extensive experiments on three widely used benchmark datasets demonstrate that our method outperforms representative methods in both the OW-CZSL and pCZSL settings by large margins.
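For illustration, below is a minimal PyTorch sketch of the cross-primitive compatibility idea described in the abstract: separate state and object heads whose features interact through trainable memory units via cross-attention, with primitive logits combined into open-world composition scores. All module names, layer sizes, the attention form, and the score combination are assumptions made for exposition, not the paper's actual implementation; the progressive, easy-to-hard training paradigm is not shown.

```python
# Minimal sketch (assumed design, not the official ProCC code).
import torch
import torch.nn as nn


class CrossPrimitiveCompatibility(nn.Module):
    """Models state-object interactions with trainable memory units."""

    def __init__(self, feat_dim, num_states, num_objects, num_memory=16):
        super().__init__()
        # Trainable memory units shared by both primitives (assumption).
        self.memory = nn.Parameter(torch.randn(num_memory, feat_dim))
        self.state_attn = nn.MultiheadAttention(feat_dim, num_heads=4, batch_first=True)
        self.object_attn = nn.MultiheadAttention(feat_dim, num_heads=4, batch_first=True)
        self.state_head = nn.Linear(feat_dim, num_states)
        self.object_head = nn.Linear(feat_dim, num_objects)

    def forward(self, state_feat, object_feat):
        # state_feat, object_feat: (batch, feat_dim) from a shared visual backbone.
        mem = self.memory.unsqueeze(0).expand(state_feat.size(0), -1, -1)
        # Each primitive queries the memory conditioned on the other primitive's
        # feature, yielding cross-primitive visual attention.
        s_ctx, _ = self.state_attn(object_feat.unsqueeze(1), mem, mem)
        o_ctx, _ = self.object_attn(state_feat.unsqueeze(1), mem, mem)
        state_logits = self.state_head(state_feat + s_ctx.squeeze(1))
        object_logits = self.object_head(object_feat + o_ctx.squeeze(1))
        return state_logits, object_logits


if __name__ == "__main__":
    # Toy sizes chosen for the example only.
    model = CrossPrimitiveCompatibility(feat_dim=512, num_states=100, num_objects=200)
    s_logits, o_logits = model(torch.randn(4, 512), torch.randn(4, 512))
    # Open-world inference: score every possible state-object pair via an
    # outer sum of the primitive logits -> (batch, num_states, num_objects).
    comp_scores = s_logits.unsqueeze(2) + o_logits.unsqueeze(1)
```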

Published

2024-03-24

How to Cite

Huo, F., Xu, W., Guo, S., Guo, J., Wang, H., Liu, Z., & Lu, X. (2024). ProCC: Progressive Cross-Primitive Compatibility for Open-World Compositional Zero-Shot Learning. Proceedings of the AAAI Conference on Artificial Intelligence, 38(11), 12689-12697. https://doi.org/10.1609/aaai.v38i11.29164

Issue

Vol. 38 No. 11 (2024)

Section

AAAI Technical Track on Machine Learning II