CODE: Contrastive Pre-training with Adversarial Fine-Tuning for Zero-Shot Expert Linking
Keywords: AI for Social Impact (AISI Track Papers Only)
Abstract
Expert finding, a popular service provided by many online platforms such as Expertise Finder, LinkedIn, and AMiner, helps users locate qualified candidates, consultants, and collaborators. However, its quality suffers from a lack of ample sources of expert information. This paper employs AMiner as the basis, aiming to link experts from any external source to their counterparts on AMiner. Since it is infeasible to acquire sufficient linkages from arbitrary external sources, we explore the problem of zero-shot expert linking. We propose CODE, which first pre-trains an expert linking model via contrastive learning on AMiner so that it captures the representation and matching patterns of experts without supervised signals; the model is then fine-tuned between AMiner and external sources in an adversarial manner to enhance its transferability. For evaluation, we first design two intrinsic tasks, author identification and paper clustering, to validate the representation and matching capability endowed by contrastive learning. The final expert linking performance on two genres of external sources further demonstrates the superiority of the adversarial fine-tuning method. Additionally, we describe the online deployment of CODE and continuously improve its online performance via active learning.
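The contrastive pre-training step mentioned above can be illustrated with a minimal sketch of an InfoNCE-style loss, the standard objective in contrastive learning: an anchor expert embedding is pulled toward a positive (e.g. another representation of the same expert) and pushed away from negatives. This is a generic illustration of the technique, not the paper's actual implementation; the function name, temperature value, and toy vectors are assumptions.

```python
import numpy as np

def info_nce_loss(anchor, positive, negatives, temperature=0.1):
    """Generic InfoNCE-style contrastive loss (illustrative, not CODE's exact loss).

    Treats the positive as class 0 in a softmax over cosine similarities,
    so minimizing the loss pulls the anchor toward the positive and away
    from the negatives.
    """
    def cos(a, b):
        return float(a @ b) / (np.linalg.norm(a) * np.linalg.norm(b))

    # Similarity logits: positive first, then all negatives.
    logits = np.array([cos(anchor, positive)] +
                      [cos(anchor, n) for n in negatives]) / temperature
    logits -= logits.max()  # numerical stability before exponentiation
    probs = np.exp(logits) / np.exp(logits).sum()
    return -np.log(probs[0])  # cross-entropy with the positive as the label

# Toy usage with random embeddings (assumed dimensionality of 8).
rng = np.random.default_rng(0)
anchor = rng.normal(size=8)
negatives = [rng.normal(size=8) for _ in range(5)]
loss_matched = info_nce_loss(anchor, anchor, negatives)          # perfect positive
loss_random = info_nce_loss(anchor, rng.normal(size=8), negatives)
```

As expected, a positive identical to the anchor yields a lower loss than a random positive, which is the signal that lets the model learn matching patterns without supervised linkage labels.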
How to Cite
Chen, B., Zhang, J., Zhang, X., Tang, X., Cai, L., Chen, H., Li, C., Zhang, P., & Tang, J. (2022). CODE: Contrastive Pre-training with Adversarial Fine-Tuning for Zero-Shot Expert Linking. Proceedings of the AAAI Conference on Artificial Intelligence, 36(11), 11846-11854. https://doi.org/10.1609/aaai.v36i11.21441
AAAI Special Track on AI for Social Impact