| Location | Cumbria, York |
| --- | --- |
| Salary | £44,263 to £54,395 per year |
| Hours | Full Time |
| Contract Type | Fixed-Term/Contract |
| Placed On | 25th April 2024 |
| Closes | 16th May 2024 |
| Job Ref | 13317 |
Department
You will join an exciting new research centre in the Department of Computer Science at the University of York. The Centre for Assuring Autonomy (CfAA) is building on the work of the Assuring Autonomy International Programme (AAIP) which pioneered approaches to assuring autonomous systems and their machine learning (ML) components. The CfAA also contributes to the assurance pillar of the Institute for Safe Autonomy (ISA). You will be based at the University, in ISA, and spend time at the Robotics and AI Collaboration (RAICo) research facility in West Cumbria, giving you direct access to systems, engineers and researchers.
Role
You will research techniques for assuring the safety of ML used in safety-related applications in challenging environments. Building on previous research undertaken in the AAIP, such as AMLAS (https://www.york.ac.uk/assuring-autonomy/guidance/amlas/), you will develop methods that lead to the creation of ML components that can be demonstrated to be sufficiently safe to deploy. You will also explore how assurance of ML can be sustained through life as the systems evolve and the environment changes. This post provides a unique opportunity to develop and validate these techniques on real autonomous robotic systems being used for nuclear decommissioning and clean-up tasks. The role will require you to work closely and effectively with other team members, including experienced engineers and researchers, and to explain your research clearly and precisely to a range of different audiences.
In this role you will initially be seconded until March 2025 to work on real robotics projects at the RAICo research facility in Cumbria. Relocation expenses can be provided to support this where appropriate.
Skills, Experience & Qualifications Needed
You must have a first degree in Computer Science or a cognate discipline, and a PhD in Computer Science, autonomous systems, or equivalent experience. You should have knowledge of machine learning and, ideally, some knowledge of safety assurance and safety cases. Experience of developing machine learning models is desirable. You must have experience of undertaking high-quality research and a proven ability to take responsibility for a research project.
Interview date: week commencing 3 June 2024.
For informal enquiries: please contact Dr. Richard Hawkins – richard.hawkins@york.ac.uk.
The University strives to be diverse and inclusive – a place where we can ALL be ourselves.
We particularly encourage applications from people who identify as Black, Asian or from a Minority Ethnic background, who are underrepresented at the University.
We also encourage applications from women for senior roles.
We offer family friendly, flexible working arrangements, with forums and inclusive facilities to support our staff. #EqualityatYork