Understanding Individual Agent Importance in Multi-Agent System via Counterfactual Reasoning
DOI:
https://doi.org/10.1609/aaai.v39i15.33733
Abstract
Explaining multi-agent systems (MAS) is increasingly urgent as these systems become prevalent in various applications. Previous work has provided explanations for the actions or states of agents, yet falls short in understanding a black-boxed agent's importance within a MAS and in the overall team strategy. To bridge this gap, we propose EMAI, a novel agent-level explanation approach that evaluates each individual agent's importance. Inspired by counterfactual reasoning, a larger change in reward caused by randomizing an agent's actions indicates that agent's higher importance. We model this as a MARL problem to capture the interactions across agents. Using counterfactual reasoning, EMAI learns masking agents to identify the important ones. Specifically, we define the optimization objective to minimize the reward difference before and after action randomization, and we introduce sparsity constraints to encourage the exploration of more agents' action randomization during training. Experimental results on seven multi-agent tasks demonstrate that EMAI achieves higher-fidelity explanations than baselines and provides more effective guidance in practical applications: understanding policies, launching attacks, and patching policies.
Published
2025-04-11
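The counterfactual intuition in the abstract — randomize one agent's actions and treat the resulting change in team reward as that agent's importance — can be sketched as follows. This is a minimal toy illustration, not the paper's EMAI method: the two-agent environment, the policy, and the `importance` function are all hypothetical stand-ins, and EMAI itself learns masking agents via MARL rather than enumerating agents exhaustively as done here.

```python
import random

# Hypothetical 2-agent environment: team reward depends heavily on
# agent 0's action matching a target and only weakly on agent 1's.
def episode_reward(policy, randomized_agent=None, steps=10, rng=None):
    rng = rng or random.Random()
    total = 0.0
    for t in range(steps):
        actions = [policy(i, t) for i in range(2)]
        if randomized_agent is not None:
            # Counterfactual intervention: replace this agent's
            # action with a uniformly random one.
            actions[randomized_agent] = rng.choice([0, 1])
        total += 1.0 if actions[0] == t % 2 else 0.0  # agent 0: critical
        total += 0.1 if actions[1] == t % 2 else 0.0  # agent 1: minor
    return total

def learned_policy(agent, t):
    return t % 2  # both agents act "correctly" under the learned policy

def importance(agent, episodes=200, seed=0):
    rng = random.Random(seed)
    base = sum(episode_reward(learned_policy)
               for _ in range(episodes)) / episodes
    rand = sum(episode_reward(learned_policy, randomized_agent=agent, rng=rng)
               for _ in range(episodes)) / episodes
    # Larger reward change under randomization => more important agent.
    return abs(base - rand)

scores = [importance(i) for i in range(2)]
# scores[0] comes out much larger than scores[1]: agent 0 is important.
```

Randomizing the critical agent's actions collapses most of the reward, so its counterfactual importance score dominates that of the minor agent, mirroring the ranking EMAI's learned masking agents are trained to recover.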
How to Cite
Chen, J., Wang, Y., Wang, J., Xie, X., Hu, J., Wang, Q., & Xu, F. (2025). Understanding Individual Agent Importance in Multi-Agent System via Counterfactual Reasoning. Proceedings of the AAAI Conference on Artificial Intelligence, 39(15), 15785–15794. https://doi.org/10.1609/aaai.v39i15.33733
Section
AAAI Technical Track on Machine Learning I