NoiseGrad — Enhancing Explanations by Introducing Stochasticity to Model Weights
DOI: https://doi.org/10.1609/aaai.v36i6.20561
Keywords: Machine Learning (ML)
Abstract
Many efforts have been made to reveal the decision-making process of black-box learning machines such as deep neural networks, resulting in useful local and global explanation methods. For local explanation, stochasticity is known to help: a simple method called SmoothGrad has improved the visual quality of gradient-based attribution by adding noise to the input space and averaging the explanations of the noisy inputs. In this paper, we extend this idea and propose NoiseGrad, which enhances both local and global explanation methods. Specifically, NoiseGrad introduces stochasticity in the weight parameter space, such that the decision boundary is perturbed. NoiseGrad is expected to enhance local explanations, similarly to SmoothGrad, due to the dual relationship between input perturbation and decision boundary perturbation. We evaluate NoiseGrad and its fusion with SmoothGrad, called FusionGrad, qualitatively and quantitatively with several evaluation criteria, and show that our novel approach significantly outperforms the baseline methods. Both NoiseGrad and FusionGrad are method-agnostic and as handy as SmoothGrad, using a simple heuristic for the choice of hyperparameters without the need for fine-tuning.
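As a rough illustration of the idea described in the abstract, the following minimal PyTorch sketch shows how NoiseGrad-style averaging could wrap a vanilla-gradient explainer. It is not code from the paper: the function name, the multiplicative Gaussian weight noise, and the hyperparameter values are all assumptions for illustration.

import copy
import torch

def noisegrad_explanation(model, x, target, n_samples=25, std=0.2):
    # Hypothetical sketch: average vanilla-gradient saliency maps over
    # copies of the model whose weights are perturbed with multiplicative
    # Gaussian noise (an assumed noise scheme; the paper's exact scheme
    # and hyperparameters may differ).
    explanations = []
    for _ in range(n_samples):
        noisy_model = copy.deepcopy(model)
        with torch.no_grad():
            for param in noisy_model.parameters():
                param.mul_(1.0 + std * torch.randn_like(param))
        x_in = x.clone().requires_grad_(True)
        logits = noisy_model(x_in)
        logits[..., target].sum().backward()
        explanations.append(x_in.grad.detach())
    # Averaging over the perturbed models smooths the attribution,
    # analogously to how SmoothGrad averages over noisy inputs.
    return torch.stack(explanations).mean(dim=0)

Because the averaging happens outside the base explainer, the vanilla gradient could be swapped for any other attribution method, which is what makes the approach method-agnostic; a FusionGrad-style variant would additionally perturb the input x inside the loop.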
Published
2022-06-28
How to Cite
Bykov, K., Hedström, A., Nakajima, S., & Höhne, M. M.-C. (2022). NoiseGrad — Enhancing Explanations by Introducing Stochasticity to Model Weights. Proceedings of the AAAI Conference on Artificial Intelligence, 36(6), 6132-6140. https://doi.org/10.1609/aaai.v36i6.20561
Issue
Vol. 36 No. 6 (2022)
Section
AAAI Technical Track on Machine Learning I