Deconstructed Generation-Based Zero-Shot Model
DOI:
https://doi.org/10.1609/aaai.v37i1.25102
Keywords:
CV: Language and Vision, ML: Transfer, Domain Adaptation, Multi-Task Learning
Abstract
Recent research on Generalized Zero-Shot Learning (GZSL) has focused primarily on generation-based methods. However, the current literature has overlooked the fundamental principles of these methods and has made only limited progress while growing increasingly complex. In this paper, we aim to deconstruct the generator-classifier framework and provide guidance for its improvement and extension. We begin by breaking down the generator-learned unseen-class distribution into class-level and instance-level distributions. Through our analysis of the role these two types of distributions play in solving the GZSL problem, we generalize the focus of the generation-based approach, emphasizing the importance of (i) attribute generalization in generator learning and (ii) independent classifier learning with partially biased data. Based on this analysis, we present a simple method that outperforms state-of-the-art methods on four public GZSL datasets, demonstrating the validity of our deconstruction. Furthermore, our proposed method remains effective even without a generative model, representing a step towards simplifying the generator-classifier structure. Our code is available at https://github.com/cdb342/DGZ.
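For context, the sketch below illustrates the generic generator-classifier GZSL pipeline that the abstract deconstructs: a conditional generator synthesizes visual features for unseen classes from their attribute vectors, and a classifier over all classes is then trained on real seen-class features plus the synthesized unseen-class features. This is a minimal illustration under assumed shapes, class splits, and architecture choices; it is not the authors' DGZ implementation (see the linked repository for that).

```python
# Minimal sketch of a generation-based GZSL pipeline (assumed, not the DGZ code).
import torch
import torch.nn as nn

FEAT_DIM, ATTR_DIM, NOISE_DIM = 2048, 85, 64   # assumed dimensions
N_SEEN, N_UNSEEN = 40, 10                      # assumed class split
N_TOTAL = N_SEEN + N_UNSEEN

class ConditionalGenerator(nn.Module):
    """Maps (noise, class attributes) -> a synthetic visual feature."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(NOISE_DIM + ATTR_DIM, 1024), nn.LeakyReLU(0.2),
            nn.Linear(1024, FEAT_DIM), nn.ReLU(),
        )
    def forward(self, noise, attrs):
        return self.net(torch.cat([noise, attrs], dim=1))

# Toy stand-ins for the class attribute matrix and real seen-class features.
attrs_all = torch.randn(N_TOTAL, ATTR_DIM)
seen_feats = torch.randn(400, FEAT_DIM)
seen_labels = torch.randint(0, N_SEEN, (400,))

# Step 1: synthesize unseen-class features with a generator
# (assumed to have been trained on seen-class data beforehand).
gen = ConditionalGenerator()
per_class = 50
unseen_labels = torch.arange(N_SEEN, N_TOTAL).repeat_interleave(per_class)
noise = torch.randn(len(unseen_labels), NOISE_DIM)
with torch.no_grad():
    unseen_feats = gen(noise, attrs_all[unseen_labels])

# Step 2: train a joint classifier over seen + unseen classes on the mixed data.
X = torch.cat([seen_feats, unseen_feats])
y = torch.cat([seen_labels, unseen_labels])
clf = nn.Linear(FEAT_DIM, N_TOTAL)
opt = torch.optim.Adam(clf.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()
for epoch in range(5):
    opt.zero_grad()
    loss = loss_fn(clf(X), y)
    loss.backward()
    opt.step()
```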
Published
2023-06-26
How to Cite
Chen, D., Shen, Y., Zhang, H., & Torr, P. H. (2023). Deconstructed Generation-Based Zero-Shot Model. Proceedings of the AAAI Conference on Artificial Intelligence, 37(1), 295-303. https://doi.org/10.1609/aaai.v37i1.25102
Section
AAAI Technical Track on Computer Vision I