TY - JOUR
AU - Chen, Bo
AU - Zhang, Jing
AU - Zhang, Xiaokang
AU - Tang, Xiaobin
AU - Cai, Lingfan
AU - Chen, Hong
AU - Li, Cuiping
AU - Zhang, Peng
AU - Tang, Jie
PY - 2022/06/28
Y2 - 2024/03/28
TI - CODE: Contrastive Pre-training with Adversarial Fine-Tuning for Zero-Shot Expert Linking
JF - Proceedings of the AAAI Conference on Artificial Intelligence
JA - AAAI
VL - 36
IS - 11
SE - AAAI Special Track on AI for Social Impact
DO - 10.1609/aaai.v36i11.21441
UR - https://ojs.aaai.org/index.php/AAAI/article/view/21441
SP - 11846-11854
AB - Expert finding, a popular service provided by many online websites such as Expertise Finder, LinkedIn, and AMiner, is beneficial for seeking candidate qualifications, consultants, and collaborators. However, its quality suffers from a lack of ample sources of expert information. This paper employs AMiner as the basis, with the aim of linking any external experts to their counterparts on AMiner. As it is infeasible to acquire sufficient linkages from arbitrary external sources, we explore the problem of zero-shot expert linking. In this paper, we propose CODE, which first pre-trains an expert linking model by contrastive learning on AMiner so that it can capture the representation and matching patterns of experts without supervised signals; the model is then fine-tuned between AMiner and external sources in an adversarial manner to enhance its transferability. For evaluation, we first design two intrinsic tasks, author identification and paper clustering, to validate the representation and matching capability endowed by contrastive learning. The final external expert linking performance on two genres of external sources also demonstrates the superiority of the adversarial fine-tuning method. Additionally, we show the online deployment of CODE and continuously improve its online performance via active learning.
ER -