Viewpoint-Aware Loss with Angular Regularization for Person Re-Identification


  • Zhihui Zhu, Sun Yat-sen University
  • Xinyang Jiang, Tencent YouTu Lab
  • Feng Zheng, Southern University of Science and Technology
  • Xiaowei Guo, Tencent YouTu Lab
  • Feiyue Huang, Tencent YouTu Lab
  • Xing Sun, Tencent YouTu Lab
  • Weishi Zheng, Sun Yat-sen University



Although great progress has recently been made in supervised person re-identification (Re-ID), the task remains visually challenging due to the viewpoint variation of a person. Most existing viewpoint-based person Re-ID methods project images from each viewpoint into separate, unrelated sub-feature spaces: they model the identity-level distribution within each individual viewpoint but ignore the underlying relationships between different viewpoints. To address this problem, we propose a novel approach called Viewpoint-Aware Loss with Angular Regularization (VA-reID). Instead of allocating one subspace per viewpoint, our method projects features from different viewpoints onto a unified hypersphere and effectively models the feature distribution at both the identity level and the viewpoint level. In addition, rather than treating viewpoints as hard labels for conventional viewpoint classification, we introduce viewpoint-aware adaptive label smoothing regularization (VALSR), which assigns adaptive soft labels to feature representations. VALSR effectively resolves the ambiguity of viewpoint cluster label assignment. Extensive experiments on the Market1501 and DukeMTMC-reID datasets demonstrate that our method outperforms state-of-the-art supervised Re-ID methods.
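To make the soft-label idea concrete, the sketch below contrasts conventional label smoothing (smoothing mass spread uniformly over all classes) with a simplified, hypothetical viewpoint-aware variant in which each (identity, viewpoint) pair forms a cluster and the smoothing mass is redistributed only among the same identity's other viewpoint clusters. This is an illustration of the general mechanism the abstract describes, not the paper's exact VALSR formulation; the function names, the fixed smoothing parameter `eps`, and the cluster indexing scheme are all assumptions.

```python
import numpy as np

def label_smoothing(num_classes, target, eps=0.1):
    """Conventional label smoothing: the one-hot target is softened
    uniformly, with eps split evenly across all non-target classes."""
    y = np.full(num_classes, eps / (num_classes - 1))
    y[target] = 1.0 - eps
    return y

def viewpoint_soft_label(num_ids, num_views, identity, view, eps=0.1):
    """Hypothetical viewpoint-aware soft label (illustrative only):
    each (identity, viewpoint) pair is one cluster, and the smoothing
    mass eps is shared among the *same identity's* other viewpoint
    clusters instead of being spread over all clusters uniformly."""
    num_clusters = num_ids * num_views
    y = np.zeros(num_clusters)
    target = identity * num_views + view
    y[target] = 1.0 - eps
    # redistribute eps over this identity's remaining viewpoint clusters
    for v in range(num_views):
        cluster = identity * num_views + v
        if cluster != target:
            y[cluster] = eps / (num_views - 1)
    return y
```

Both functions return a valid probability distribution (entries sum to 1), but the viewpoint-aware variant keeps all probability mass inside one identity, which is one way to encode the intuition that features of the same person under different viewpoints should stay related rather than being pushed into unrelated subspaces.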




How to Cite

Zhu, Z., Jiang, X., Zheng, F., Guo, X., Huang, F., Sun, X., & Zheng, W. (2020). Viewpoint-Aware Loss with Angular Regularization for Person Re-Identification. Proceedings of the AAAI Conference on Artificial Intelligence, 34(07), 13114-13121.



AAAI Technical Track: Vision