Helmholtz Gemeinschaft

Model guidance via explanations turns image classifiers into segmentation models

Item Type: Conference or Workshop Item
Title: Model guidance via explanations turns image classifiers into segmentation models
Creators Name: Yu, X., Franzen, J., Samek, W., Höhne, M.M.C. and Kainmueller, D.
Abstract: Heatmaps generated on inputs of image classification networks via explainable AI methods like Grad-CAM and LRP have been observed to resemble segmentations of input images in many cases. Consequently, heatmaps have also been leveraged for achieving weakly supervised segmentation with image-level supervision. On the other hand, losses can be imposed on differentiable heatmaps, which has been shown to serve for (1) improving heatmaps to be more human-interpretable, (2) regularization of networks towards better generalization, (3) training diverse ensembles of networks, and (4) explicitly ignoring confounding input features. Due to the latter use case, the paradigm of imposing losses on heatmaps is often referred to as "Right for the Right Reasons". We unify these two lines of research by investigating semi-supervised segmentation as a novel use case for the Right for the Right Reasons paradigm. First, we show formal parallels between differentiable heatmap architectures and standard encoder-decoder architectures for image segmentation. Second, we show that such differentiable heatmap architectures yield competitive results when trained with standard segmentation losses. Third, we show that such architectures allow for training with weak supervision in the form of image-level labels and small numbers of pixel-level labels, outperforming comparable encoder-decoder models.
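The core idea of imposing a segmentation loss on a differentiable heatmap can be sketched as follows. This is a minimal PyTorch illustration, not the paper's implementation: the `TinyCNN` backbone, the per-image max normalization, and the use of Grad-CAM with `create_graph=True` (so the heatmap loss backpropagates into the classifier's weights) are assumptions chosen for brevity.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical toy classifier; the paper uses standard backbones.
class TinyCNN(nn.Module):
    def __init__(self, n_classes=2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 8, 3, padding=1), nn.ReLU(),
            nn.Conv2d(8, 8, 3, padding=1), nn.ReLU(),
        )
        self.head = nn.Linear(8, n_classes)

    def forward(self, x):
        fmap = self.features(x)                    # B x 8 x H x W
        logits = self.head(fmap.mean(dim=(2, 3)))  # global average pooling
        return logits, fmap

def gradcam_heatmap(logits, fmap, target_class):
    # Differentiable Grad-CAM: channel weights come from d(score)/d(fmap).
    # create_graph=True keeps this step differentiable, so a loss on the
    # heatmap can flow back into the classifier's parameters.
    score = logits[:, target_class].sum()
    grads = torch.autograd.grad(score, fmap, create_graph=True)[0]
    weights = grads.mean(dim=(2, 3), keepdim=True)
    return F.relu((weights * fmap).sum(dim=1))     # B x H x W

model = TinyCNN()
x = torch.randn(2, 3, 16, 16)
mask = (torch.rand(2, 16, 16) > 0.5).float()       # toy pixel-level labels

logits, fmap = model(x)
heat = gradcam_heatmap(logits, fmap, target_class=1)
# Normalize to [0, 1] per image (illustrative choice), then impose a
# standard segmentation loss directly on the heatmap.
heat = heat / (heat.amax(dim=(1, 2), keepdim=True) + 1e-6)
seg_loss = F.binary_cross_entropy(heat, mask)
seg_loss.backward()
```

The key point is that the heatmap is treated as a segmentation prediction: the same loss used to train an encoder-decoder segmentation model is applied to the explanation itself, and gradients update the classifier.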
Keywords: Few-Shot Learning, Semantic Segmentation, Layer-Wise Relevance Propagation
Source: Communications in Computer and Information Science
Title of Book: Explainable Artificial Intelligence
ISSN: 1865-0929
ISBN: 978-3-031-63796-4
Publisher: Springer
Page Range: 113-129
Number of Pages: 17
Date: 10 July 2024
Official Publication: https://doi.org/10.1007/978-3-031-63797-1_7


Open Access
MDC Library