Available
Project number: 2025_05
Start date: October 2025
Project themes:
Main supervisor: Professor of AI
Co-supervisor:
Additional Information:
Explainable AI for medical image classification: applications to brain imaging
Deep learning models for image classification are increasingly being explored for their potential applications in clinical settings. However, these models often operate as "black boxes," making it challenging to understand the reasoning behind specific classifications, even when their performance is exemplary. To address this limitation, Explainable AI (XAI) has emerged as a crucial field of research, offering tools that enhance the transparency of AI-driven classifications. We have developed a novel XAI tool, Causal Responsibility-based Explanations (ReX), which is being tailored to meet the unique requirements of various medical data types.
XAI is particularly important in the medical domain for two main reasons: compliance with the regulatory requirements of the EU AI Act and the establishment of trust with clinical stakeholders. Despite the significant need, XAI has not yet become a standard component of medical image classification. This gap is largely due to the lack of validation processes that meet the stringent demands of clinical research, and to a mismatch between the explanation formats clinicians find useful and those developers typically produce. By collaborating closely with our clinical partners from the outset, we aim to overcome these challenges and ensure that our XAI tool is both reliable and user-friendly for healthcare professionals.
The primary objectives of this project are threefold. First, we will extend ReX to the data types and models used in clinical applications, producing explanations at the pixel level for 2D images and at the voxel level for 3D images. Second, we will refine ReX to ensure its effectiveness and reliability specifically for brain imaging. Lastly, we will develop and implement methods to validate the explanations produced by ReX, ensuring they meet clinical standards and provide meaningful insights for medical professionals.
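ReX itself is a causal, responsibility-based method; purely to illustrate what a pixel-level explanation map looks like, the sketch below computes a generic occlusion-based attribution for a 2D image classifier. The model callable, patch size, and baseline value are assumptions made for this example and are not part of ReX; the same idea extends to voxel blocks for 3D volumes.

    import numpy as np

    def occlusion_map(image, model, patch=8, baseline=0.0):
        """Pixel-level attribution by occlusion for a single 2D image.

        `model` is assumed to be a callable mapping an (H, W) array to the
        score of the class of interest; patch and baseline are illustrative.
        """
        h, w = image.shape
        reference = model(image)                   # score on the unmodified image
        attributions = np.zeros((h, w), dtype=float)
        for y in range(0, h, patch):
            for x in range(0, w, patch):
                occluded = image.copy()
                occluded[y:y + patch, x:x + patch] = baseline
                # The drop in score when this region is hidden is taken as its importance.
                attributions[y:y + patch, x:x + patch] = reference - model(occluded)
        return attributions

    # Toy usage: a stand-in "model" that scores the mean intensity of the image
    explanation = occlusion_map(np.random.rand(64, 64), model=lambda img: img.mean())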
We are now accepting applications for entry on 1 October 2025.
How to apply
Candidates should possess, or be expected to achieve, a 1st or upper 2nd class degree in a relevant subject (including the biosciences, computer science, mathematics, statistics, data science, chemistry, or physics), and be enthusiastic about combining their expertise with other disciplines in the field of healthcare.
Important information for International Students:
It is the responsibility of the student to apply for their Student Visa. Please note that the EPSRC DRIVE-Health studentship does not cover the visa application fees or the Immigration Health Surcharge (IHS) required for access to the National Health Service. The IHS is mandatory for anyone entering the UK on a Student Visa and is currently £776 for each year of study. Further detail can be found under the International Students tab below.
Next Steps
- Applications submitted by the closing date of Thursday 6 February 2025 will be considered by the CDT. We will contact shortlisted applicants with information about this part of the recruitment process.
- Candidates will be invited to attend an interview. Interviews are projected to take place in April 2025.
- Project selection will be through a panel interview chaired by either Professor Richard Dobson or Professor Vasa Curcin (CDT Directors), followed by an informal discussion with prospective supervisors.
- If you have any questions related to the specific project you are applying for, please contact the main supervisor of the project directly.
For any other questions about the recruitment process, please email us at