| Location: | Durham |
|---|---|
| Salary: | £38,784 to £46,049 |
| Hours: | Full Time |
| Contract Type: | Fixed-Term/Contract |
| Placed On: | 10th April 2026 |
| Closes: | 19th April 2026 |
| Job Ref: | 26000369 |
The Role
The role focuses on advancing research in explainable and trustworthy machine learning, with a particular emphasis on mechanistic interpretability and its application to healthcare data. The successful candidate will contribute to understanding how modern machine learning models represent information internally and how their predictions can be made transparent, reliable, and clinically meaningful. This work will involve developing and applying interpretability techniques to machine learning systems used in healthcare contexts, including tasks such as risk prediction, clinical decision support, and the analysis of complex biomedical datasets. The research environment is highly interdisciplinary, bringing together expertise from machine learning, statistics, data science, and healthcare research.
The successful applicant will be expected to design and implement analytical and interpretability frameworks to investigate the internal representations and decision-making processes of machine learning models. This includes developing and applying techniques such as feature attribution, representation analysis, causal probing, and mechanistic circuit analysis. The postholder will also develop predictive models using modern deep learning frameworks (e.g., PyTorch) and evaluate them with a focus on interpretability, robustness, and real-world applicability in healthcare settings.
The role also involves developing reproducible research software and scalable data analysis pipelines in Python for machine learning and interpretability research. These tools will support systematic investigation of model behaviour and enable robust experimentation on large and complex datasets relevant to healthcare and biomedical research.
The postholder will be expected to contribute actively to high-quality research outputs, including peer-reviewed publications in leading machine learning and interdisciplinary venues, open-source software tools, and collaborative research initiatives. The successful candidate will also contribute to the preparation and development of research grant proposals, supporting the group's strong track record of successful funding and helping to expand its research portfolio in explainable and trustworthy AI for healthcare.