Dynamic Graph Representation for Occlusion Handling in Biometrics

Authors

  • Min Ren, University of Chinese Academy of Sciences
  • Yunlong Wang, CRIPAC, NLPR, CASIA
  • Zhenan Sun, CRIPAC, NLPR, CASIA
  • Tieniu Tan, CRIPAC, NLPR, CASIA

DOI:

https://doi.org/10.1609/aaai.v34i07.6869

Abstract

The generalization ability of convolutional neural networks (CNNs) for biometrics drops greatly due to the adverse effects of various occlusions. To this end, we propose a novel unified framework that integrates the merits of both CNNs and graphical models to learn dynamic graph representations for occlusion problems in biometrics, called Dynamic Graph Representation (DGR). Convolutional features over specific regions are re-crafted by a graph generator to establish the connections among the spatial parts of the biometric trait, and Feature Graphs are built upon these node representations. Each node of a Feature Graph corresponds to a specific part of the input image, and the edges express the spatial relationships between parts. By analyzing the similarities between the nodes, the framework is able to adaptively remove the nodes representing the occluded parts. During dynamic graph matching, we propose a novel strategy to measure the distances of both nodes and adjacency matrices. In this way, the proposed method is more interpretable than CNN-based methods because dynamic graph matching implies a more illustrative and reasonable inference of the biometric decision. Experiments conducted on iris and face recognition demonstrate the superiority of the proposed framework, which boosts the accuracy of occluded biometrics recognition by a large margin compared with baseline methods.
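The matching procedure described above can be illustrated with a minimal sketch (not the authors' code): each image yields a feature graph whose nodes are part-level CNN features, nodes judged occluded are removed before matching, and the final distance combines a node-feature term with an adjacency-matrix term. All names (FeatureGraph, sim_threshold, alpha) and the diagonal-similarity occlusion criterion are illustrative assumptions, assuming both graphs are built on the same fixed spatial grid.

```python
import numpy as np

class FeatureGraph:
    def __init__(self, nodes, adjacency):
        self.nodes = nodes          # (N, D) part-level CNN features, one row per spatial part
        self.adjacency = adjacency  # (N, N) spatial relationships between parts

def cosine_sim(a, b):
    # Pairwise cosine similarities between the node features of two graphs.
    a = a / (np.linalg.norm(a, axis=1, keepdims=True) + 1e-8)
    b = b / (np.linalg.norm(b, axis=1, keepdims=True) + 1e-8)
    return a @ b.T

def dynamic_graph_distance(probe, gallery, sim_threshold=0.3, alpha=0.5):
    sim = cosine_sim(probe.nodes, gallery.nodes)
    # Both graphs share the same spatial grid here, so node i in the probe
    # corresponds to node i in the gallery. A node pair is kept only if it is
    # similar enough; occluded parts tend to match poorly and are dropped.
    keep = np.diag(sim) > sim_threshold
    if keep.sum() == 0:
        return 1.0  # everything judged occluded: fall back to maximum distance
    # Node term: average dissimilarity over the retained node pairs.
    node_dist = 1.0 - np.diag(sim)[keep].mean()
    # Structure term: difference of the adjacency sub-matrices over retained nodes.
    adj_p = probe.adjacency[np.ix_(keep, keep)]
    adj_g = gallery.adjacency[np.ix_(keep, keep)]
    adj_dist = np.abs(adj_p - adj_g).mean()
    return alpha * node_dist + (1 - alpha) * adj_dist
```

A lower returned value indicates a closer match; the threshold and the weighting between the node and structure terms would in practice be tuned on validation data.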


Published

2020-04-03

How to Cite

Ren, M., Wang, Y., Sun, Z., & Tan, T. (2020). Dynamic Graph Representation for Occlusion Handling in Biometrics. Proceedings of the AAAI Conference on Artificial Intelligence, 34(07), 11940-11947. https://doi.org/10.1609/aaai.v34i07.6869

Section

AAAI Technical Track: Vision