SHAMSUL


Application

Name
SHAMSUL
Type
Gradio App
URL
https://shamsul.serve.scilifelab.se
Source Code
https://github.com/anondo1969/SHAMSUL
Image
mahbub1969/shamsul:v6
Created
09 Jan, 2025
Updated
19 Feb, 2025
Tags
chest x-ray, clinical decision support system, deep learning, heatmap score visualization, interpretability methods, grad-cam, lime, lrp, shap

SHAMSUL explores the interpretability of chest X-ray pathology predictions using four methods: Grad-CAM, LIME, SHAP, and LRP. It provides heatmaps and evaluation metrics that give better insight into the medical significance of predictions made by deep learning models.
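Of the four methods, Grad-CAM is the simplest to express compactly: it weights a convolutional layer's activation maps by the spatially averaged gradients of the target class score and applies a ReLU. The sketch below shows that core computation in plain NumPy on synthetic arrays; it is an illustration of the general technique, not SHAMSUL's exact implementation, and the array shapes are assumptions.

```python
import numpy as np

def grad_cam_heatmap(activations, gradients):
    """Compute a Grad-CAM heatmap from a conv layer's activations and the
    gradients of the target class score with respect to them.

    activations, gradients: arrays of shape (channels, H, W).
    Returns an (H, W) heatmap scaled to [0, 1].
    """
    # Channel weights: global average pooling of the gradients.
    weights = gradients.mean(axis=(1, 2))  # shape (channels,)
    # Weighted sum of activation maps, followed by ReLU.
    cam = np.maximum((weights[:, None, None] * activations).sum(axis=0), 0)
    # Scale to [0, 1] for visualization (guard against an all-zero map).
    if cam.max() > 0:
        cam = cam / cam.max()
    return cam

# Toy example with random feature maps from a hypothetical conv layer.
rng = np.random.default_rng(0)
acts = rng.standard_normal((8, 7, 7))
grads = rng.standard_normal((8, 7, 7))
heatmap = grad_cam_heatmap(acts, grads)
print(heatmap.shape)  # (7, 7)
```

In practice the resulting low-resolution map is upsampled to the input image size and overlaid on the chest X-ray.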

Software

Type
Cloud Application
Operating System
Kubernetes
Version
ghcr.io/scilifelabdatacentre/serve-charts/custom-app:1.1.1

Resource

CPU Request
500m
CPU Limit
4000m
Memory Request
0.5Gi
Memory Limit
8Gi
Storage Request
200Mi
Storage Limit
5000Mi
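The figures above use standard Kubernetes quantity notation (`m` for millicores, `Mi`/`Gi` for binary memory units). As a sketch, they would correspond to a container `resources` stanza like the one below; mapping the storage fields to `ephemeral-storage` is an assumption about how Serve applies them, not confirmed by this page.

```yaml
resources:
  requests:
    cpu: 500m            # 0.5 CPU core guaranteed
    memory: 0.5Gi        # equivalent to 512Mi
    ephemeral-storage: 200Mi   # assumed mapping of "Storage Request"
  limits:
    cpu: 4000m           # hard cap of 4 CPU cores
    memory: 8Gi
    ephemeral-storage: 5000Mi  # assumed mapping of "Storage Limit"
```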

Project

Name
Mahbub's_Apps
Created
13 Dec, 2024

The interpretability of deep neural networks has become a subject of great interest within the medical and healthcare domain. This attention stems from concerns regarding transparency, legal and ethical considerations, and the medical significance of predictions generated by these deep neural networks in clinical decision support systems. To address this matter, our study delves into the application of four well-established interpretability methods: Local Interpretable Model-agnostic Explanations (LIME), Shapley Additive exPlanations (SHAP), Gradient-weighted Class Activation Mapping (Grad-CAM), and Layer-wise Relevance Propagation (LRP). Leveraging the approach of transfer learning with a multi-label-multi-class chest radiography dataset, we aim to interpret predictions pertaining to specific pathology classes. Our analysis encompasses both single-label and multi-label predictions, providing a comprehensive and unbiased assessment through quantitative and qualitative investigations, which are compared against human expert annotation. Notably, Grad-CAM demonstrates the most favorable performance in quantitative evaluation, while the LIME heatmap score segmentation visualization exhibits the highest level of medical significance. Our research underscores both the outcomes and the challenges faced in the holistic approach adopted for assessing these interpretability methods and suggests that a multimodal-based approach, incorporating diverse sources of information beyond chest radiography images, could offer additional insights for enhancing interpretability in the medical domain.
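Quantitative comparison of a heatmap against a human expert annotation is commonly done with an overlap score on the highlighted region. The sketch below computes intersection-over-union between a thresholded heatmap and a binary expert mask; the threshold value and the choice of IoU here are illustrative assumptions, not necessarily the exact protocol used in this study.

```python
import numpy as np

def heatmap_iou(heatmap, expert_mask, threshold=0.5):
    """Intersection-over-union between a thresholded heatmap and a
    binary expert annotation mask of the same shape."""
    pred = heatmap >= threshold
    mask = expert_mask.astype(bool)
    union = np.logical_or(pred, mask).sum()
    if union == 0:
        return 1.0  # both regions empty: count as perfect agreement
    return np.logical_and(pred, mask).sum() / union

# Toy 4x4 example: the heatmap highlights the top-left quadrant,
# while the expert annotated the whole top half.
hm = np.zeros((4, 4)); hm[:2, :2] = 0.9
mask = np.zeros((4, 4)); mask[:2, :] = 1
print(heatmap_iou(hm, mask))  # 4 overlapping pixels / 8 in the union = 0.5
```

A score like this can be computed per pathology class and averaged, which is one way methods such as Grad-CAM and LIME can be ranked against each other quantitatively.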

Owner
Mahbub Ul Alam
mahbub.ul.alam@igp.uu.se
Department of Immunology, Genetics and Pathology
Uppsala universitet (Uppsala University)