Self-Supervised Graph Attention Networks for Deep Weighted Multi-View Clustering

Authors

  • Zongmo Huang, University of Electronic Science and Technology of China
  • Yazhou Ren, University of Electronic Science and Technology of China; Shenzhen Institute for Advanced Study, University of Electronic Science and Technology of China
  • Xiaorong Pu, University of Electronic Science and Technology of China; Shenzhen Institute for Advanced Study, University of Electronic Science and Technology of China
  • Shudong Huang, Sichuan University
  • Zenglin Xu, Harbin Institute of Technology, Shenzhen
  • Lifang He, Lehigh University

DOI:

https://doi.org/10.1609/aaai.v37i7.25960

Keywords:

ML: Multi-Instance/Multi-View Learning, ML: Clustering

Abstract

As one of the most important research topics in unsupervised learning, Multi-View Clustering (MVC) has been widely studied over the past decade, and numerous MVC methods have been developed. Among these methods, the recently emerged Graph Neural Networks (GNNs) show promise in modeling both topological structure and node attributes in the form of graphs, to guide unified embedding learning and clustering. However, the effectiveness of existing GNN-based MVC methods remains limited by their insufficient use of self-supervised information and graph information, which is reflected in two aspects: 1) most of these models use the self-supervised information only to guide feature learning, overlooking the fact that such information can also be applied to graph learning and sample weighting; 2) in these models, the use of graph information is generally limited to feature aggregation, yet it also provides valuable evidence for detecting noisy samples. To this end, in this paper we propose Self-Supervised Graph Attention Networks for Deep Weighted Multi-View Clustering (SGDMC), which improves the performance of GNN-based deep MVC models by making full use of both the self-supervised information and the graph information. Specifically, we develop a novel attention-allocating approach that considers both the similarity of node attributes and the self-supervised information, so as to comprehensively evaluate the relevance among different nodes. Meanwhile, to alleviate the negative impact of noisy samples and of the discrepancy among cluster structures, we further design a sample-weighting strategy based on the attention graph as well as the discrepancy between the global pseudo-labels and the local cluster assignments. Experimental results on multiple real-world datasets demonstrate the effectiveness of our method over existing approaches.
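The two ideas sketched in the abstract (scoring node relevance from both attribute similarity and shared pseudo-labels, then weighting samples by how well their attended neighborhood agrees with their own pseudo-label) can be illustrated with a toy NumPy sketch. The function names, the convex combination with weight `alpha`, and the agreement-based sample weight are all illustrative assumptions for exposition; they are not the paper's actual SGDMC formulation.

```python
import numpy as np

def attention_weights(X, pseudo_labels, adj, alpha=0.5):
    """Toy attention allocation: combine node-attribute similarity with
    pseudo-label agreement, restricted to the graph's edges.
    (Illustrative sketch only; not the exact SGDMC attention.)"""
    # Cosine similarity of node attributes.
    Xn = X / (np.linalg.norm(X, axis=1, keepdims=True) + 1e-12)
    sim = Xn @ Xn.T
    # Self-supervised signal: 1 where two nodes share a pseudo-label.
    agree = (pseudo_labels[:, None] == pseudo_labels[None, :]).astype(float)
    # Convex mix of the two relevance cues (alpha is an assumed knob).
    score = alpha * sim + (1 - alpha) * agree
    # Row-wise softmax over each node's neighbors only.
    w = np.exp(score) * adj
    return w / (w.sum(axis=1, keepdims=True) + 1e-12)

def sample_weights(w, pseudo_labels):
    """Toy sample weighting: a node whose attended neighbors mostly carry
    a different pseudo-label gets a low weight (a proxy for noisiness)."""
    agree = (pseudo_labels[:, None] == pseudo_labels[None, :]).astype(float)
    return (w * agree).sum(axis=1)  # each value lies in [0, 1]

# Tiny example: three nodes, two clusters, fully connected with self-loops.
X = np.array([[1.0, 0.0], [1.0, 0.1], [0.0, 1.0]])
labels = np.array([0, 0, 1])
adj = np.ones((3, 3))
w = attention_weights(X, labels, adj)
s = sample_weights(w, labels)
```

Each row of `w` sums to one, so `s` is simply the fraction of attention a node pays to same-pseudo-label neighbors; thresholding or rescaling it would down-weight suspected noisy samples in a clustering loss.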

Published

2023-06-26

How to Cite

Huang, Z., Ren, Y., Pu, X., Huang, S., Xu, Z., & He, L. (2023). Self-Supervised Graph Attention Networks for Deep Weighted Multi-View Clustering. Proceedings of the AAAI Conference on Artificial Intelligence, 37(7), 7936-7943. https://doi.org/10.1609/aaai.v37i7.25960

Section

AAAI Technical Track on Machine Learning II