Location: Manchester
Salary: £37,694 to £46,049 per annum, depending on relevant experience
Hours: Full Time
Contract Type: Fixed-Term/Contract
Placed On: 4th August 2025
Closes: 14th August 2025
Job Ref: SAE-029413
This 18-month appointment forms part of the project “Hardware-Level AI Safety Verification”, funded by the Advanced Research and Invention Agency (ARIA) in partnership with the University of Manchester and the University of Birmingham. The project belongs to the Mathematics for Safe AI funding stream, which aims to assess how we can leverage mathematics – from scientific world-models to mathematical proofs – to ensure that powerful AI systems interact safely and as intended with real-world systems and populations.
The project “Hardware-Level AI Safety Verification” will address a fundamental semantic mismatch between the formal guarantees produced by neural network verification tools and the actual implementation of neural networks at the hardware level. Specifically, hardware-level effects such as quantisation and sampling are often ignored during the verification of AI models, yet they are pervasive in any engineering application where digital compute platforms interact with the physical world. Their impact on the behaviour of neural network controllers and other AI models acting in a physical environment is not well understood.
The project is a collaborative effort, with academics, post-docs and interns working across universities to build better algorithms, software tools and benchmarks for assessing the safety of AI implementations at the software and hardware level. We are recruiting an enthusiastic and collaborative post-doctoral research associate with expertise in formal methods, machine learning, control theory, numerical analysis, or a related discipline, and a strong focus on AI safety and neural network verification. The post holder is expected to work closely with the two principal academic investigators in Manchester and Birmingham.
The role will be based in the Department of Computer Science at the University of Manchester. The department is one of the oldest departments of Computer Science in the United Kingdom, and hosts around 60 academic staff. The role will be associated with the Systems and Software Security research group, but interactions with other relevant research groups (Autonomy and Verification, Formal Methods, Machine Learning and Robotics) are expected.
What we offer:
As an equal-opportunities employer we support an inclusive working environment and welcome applicants from all sections of the community regardless of age, disability, ethnicity, gender, gender expression, religion or belief, sex, sexual orientation and transgender status. All appointments are made on merit.
Our University is positive about flexible working – you can find out more here
Blended working arrangements may be considered
Please note that we are unable to respond to enquiries, accept CVs or applications from Recruitment Agencies.
Enquiries about the vacancy, shortlisting and interviews:
Name: Dr. Edoardo Manino, Lecturer in AI Security
Email: edoardo.manino@manchester.ac.uk
General enquiries:
Email: People.Recruitment@manchester.ac.uk
Technical support:
Jobtrain: 0161 850 2004 jobseekersupport.jobtrain.co.uk/support/home
This vacancy will close for applications at midnight on the closing date.