Available
Project number: 2025_94
Start date: October 2025
Project themes:
Main supervisor: Senior Lecturer in Computer Science
Co-supervisor:
Additional Information:
Enhancing Patient Privacy: Differential Privacy in Federated Learning for Healthcare
Federated learning (FL) is revolutionizing machine learning by enabling models to be trained on decentralized data while preserving privacy, making it particularly impactful in sensitive domains such as healthcare. Traditional FL methods, while privacy-preserving by design, still risk exposing individual data through the model updates shared during training. Differential privacy (DP) offers a robust solution to this challenge by adding calibrated noise to the learning process, bounding how much any single patient's record can influence the trained model. This project combines federated learning with differential privacy to create a framework that can train high-quality healthcare models while maintaining strong, formal privacy guarantees.
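To illustrate the core mechanism, the minimal sketch below shows one round of federated averaging with client-level differential privacy in the spirit of Geyer et al. (2017): each client's update is clipped to a fixed L2 norm and Gaussian noise is added to the aggregate. The function name, clipping norm, and noise multiplier are illustrative assumptions, not part of the project's specification.

```python
import numpy as np

def dp_fedavg_round(global_weights, client_updates, clip_norm=1.0,
                    noise_multiplier=1.1, rng=None):
    """One round of federated averaging with client-level differential privacy.

    Each client update (computed locally as local_weights - global_weights)
    is clipped to a maximum L2 norm so that no single client can dominate
    the aggregate, and Gaussian noise calibrated to that norm is added to
    the averaged update before it is applied to the global model.
    """
    rng = rng or np.random.default_rng()
    clipped = []
    for update in client_updates:
        norm = np.linalg.norm(update)
        clipped.append(update * min(1.0, clip_norm / (norm + 1e-12)))
    mean_update = np.mean(clipped, axis=0)
    noise_std = noise_multiplier * clip_norm / len(client_updates)
    return global_weights + mean_update + rng.normal(0.0, noise_std, size=mean_update.shape)

# Toy usage: ten model parameters, updates from five simulated clients.
global_w = np.zeros(10)
updates = [np.random.default_rng(i).normal(size=10) for i in range(5)]
new_global_w = dp_fedavg_round(global_w, updates)
```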
The novelty of this project lies in applying differential privacy to federated learning in the healthcare sector, where protecting patient data is crucial. By integrating DP, the project aims to reduce privacy risks without sacrificing model performance, pushing the boundaries of privacy-preserving machine learning in healthcare.
The project aims to develop federated learning algorithms with differential privacy, ensuring secure training on decentralized healthcare data. Objectives include adapting existing DP algorithms to FL frameworks, evaluating the trade-off between privacy and performance, and validating the approach on real-world healthcare datasets. By achieving these goals, the project will contribute to safer, more secure machine learning in healthcare applications, paving the way for wider adoption of AI technologies in this critical domain.
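As a rough indication of how the privacy-performance trade-off might be quantified, the sketch below tabulates the classical single-release Gaussian-mechanism bound for a range of noise multipliers; in practice, each setting would also be paired with measured model accuracy, and a tighter accountant (e.g. Rényi DP) would be used across many training rounds. The function and the chosen values are illustrative, not prescribed by the project.

```python
import math

def gaussian_mechanism_epsilon(noise_multiplier, delta=1e-5):
    """Classical single-release Gaussian mechanism bound (Dwork & Roth):
    with L2 sensitivity 1 and noise standard deviation sigma, the release
    satisfies (sqrt(2 * ln(1.25 / delta)) / sigma, delta)-DP, valid when
    the resulting epsilon is below 1."""
    return math.sqrt(2.0 * math.log(1.25 / delta)) / noise_multiplier

# Larger noise multipliers give stronger (smaller-epsilon) guarantees
# at the cost of noisier, typically less accurate, models.
for sigma in (5.0, 10.0, 20.0, 50.0):
    print(f"noise multiplier {sigma:5.1f} -> epsilon <= {gaussian_mechanism_epsilon(sigma):.3f} per release")
```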
References:
Kairouz, P., McMahan, H. B., et al. (2019). Advances and Open Problems in Federated Learning.
Geyer, R. C., Klein, T., & Nabi, M. (2017). Differentially Private Federated Learning: A Client Level Perspective.
Wei, K., Li, J., Ding, M., et al. (2020). Federated Learning With Differential Privacy: Algorithms and Performance Analysis. arXiv.
We are now accepting applications for entry on 1 October 2025.
How to apply
Candidates should hold, or expect to achieve, a first or upper second class degree in a relevant subject (including the biosciences, computer science, mathematics, statistics, data science, chemistry, or physics) and should be enthusiastic about combining their expertise with other disciplines in the field of healthcare.
Important information for International Students:
It is the responsibility of the student to apply for their Student Visa. Please note that the EPSRC DRIVE-Health studentship does not cover the visa application fees or the Immigration Health Surcharge (IHS) required for access to the National Health Service. The IHS is mandatory for anyone entering the UK on a Student Visa and is currently £776 per year for each year of study. Further detail can be found under the International Students tab below.
Next Steps
- Applications submitted by the closing date of Thursday 6 February 2025 will be considered by the CDT. We will contact shortlisted applicants with information about this part of the recruitment process.
- Candidates will be invited to attend an interview. Interviews are expected to take place in April 2025.
- Project selection will be through a panel interview chaired by either Professor Richard Dobson or Professor Vasa Curcin (CDT Directors), followed by an informal discussion with prospective supervisors.
- If you have any questions related to the specific project you are applying for, please contact the main supervisor of the project directly.
For any other questions about the recruitment process, please email us at