TY  - JOUR
AU  - Peres da Silva, Rafael
AU  - Suphavilai, Chayaporn
AU  - Nagarajan, Niranjan
PY  - 2021/05/18
Y2  - 2024/03/28
TI  - Task Uncertainty Loss Reduce Negative Transfer in Asymmetric Multi-task Feature Learning (Student Abstract)
JF  - Proceedings of the AAAI Conference on Artificial Intelligence
JA  - AAAI
VL  - 35
IS  - 18
SE  - AAAI Student Abstract and Poster Program
DO  - 10.1609/aaai.v35i18.17930
UR  - https://ojs.aaai.org/index.php/AAAI/article/view/17930
SP  - 15867-15868
AB  - Multi-task learning (MTL) is frequently used in settings where a target task has to be learnt based on limited training data, but knowledge can be leveraged from related auxiliary tasks. While MTL can improve task performance overall relative to single-task learning (STL), these improvements can hide negative transfer (NT), where STL may deliver better performance for many individual tasks. Asymmetric multi-task feature learning (AMTFL) is an approach that tries to address this by allowing tasks with higher loss values to have a smaller influence on feature representations for learning other tasks. However, task loss values do not necessarily indicate the reliability of a model for a specific task. We present examples of NT in two orthogonal datasets (image recognition and pharmacogenomics) and tackle this challenge by using aleatoric homoscedastic uncertainty to capture the relative confidence between tasks and to set weights for task losses. Our results show that this approach reduces NT, providing a new approach to enable robust MTL.
ER  - 