Organisation Name
University of Tübingen, Hertie Institute for AI in Brain Health, Department of Data Science
Industry
Health services
Organisation Website
https://hertie.ai/data-science
Country
Germany
Sustainable Development Goals (SDGs)
SDG 3: Good Health and Well-being
SDG 9: Industry, Innovation and Infrastructure
SDG 10: Reduced Inequalities
SDG 17: Partnerships for the Goals
General Description of the AI tool
An inherently interpretable, patch-based AI model for diabetic retinopathy screening. It predicts disease presence from retinal fundus images and highlights the discriminative regions that drive each prediction, presenting these regions to clinicians to guide decisions and support interpretation in clinical workflows.
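The patch-based idea behind the model can be illustrated with a minimal numpy sketch: each image patch produces a local disease-evidence score, the image-level prediction pools those scores, and the highest-evidence patches are surfaced for clinicians. The grid size, mean pooling, and top-k selection below are illustrative assumptions for exposition, not the repository's actual architecture.

```python
import numpy as np

# Hypothetical 7x7 grid of patch-level disease logits (toy values).
rng = np.random.default_rng(0)
logits = rng.normal(0.0, 1.0, size=(7, 7))
logits[2, 3] = 4.0  # one highly suspicious patch, e.g. a lesion

# Image-level score: pool local patch evidence (here, a simple mean).
image_score = logits.mean()
prob = 1.0 / (1.0 + np.exp(-image_score))  # sigmoid -> disease probability

# Surface the k most discriminative patches for clinician review.
k = 3
flat_order = np.argsort(logits.ravel())[-k:][::-1]  # indices, highest first
top_coords = [np.unravel_index(i, logits.shape) for i in flat_order]
```

Because the image-level score is a transparent aggregate of patch scores, the highlighted patches are the model's actual evidence rather than a post-hoc explanation.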
Github, open data repository
https://github.com/kdjoumessi/Sparse-BagNet_clinical-validation
Relevant Research and Publications
1. Cynthia Rudin, 2019, “Stop explaining black box machine learning models for high stakes decisions and use interpretable models instead”.
2. Djoumessi et al., 2023, “Sparse Activations for Interpretable Disease Grading”.
3. Djoumessi et al., 2025, “An inherently interpretable AI model improves screening speed and accuracy for early diabetic retinopathy”.
4. Gervelmeyer et al., 2024, “Interpretable-by-design deep survival analysis for disease progression modeling”.
5. Mensah et al., 2025, “Clinically Interpretable Deep Learning via Sparse BagNets for Epiretinal Membrane and Related Pathology Detection”.
Needs
Funding
Customers
Public Exposure
R&D expertise
Mentorship Program