| Location: | Newcastle upon Tyne |
| --- | --- |
| Salary: | £40,247 to £45,162 |
| Hours: | Full Time |
| Contract Type: | Permanent |
| Placed On: | 2nd October 2024 |
| Closes: | 4th November 2024 |
| Job Ref: | 2773 |
About the role
We are seeking an innovative and forward-thinking Postdoctoral Research Associate with a background in Human-Computer Interaction and a strong interest in Responsible AI to join our interdisciplinary team investigating Probabilistic AI in Law Enforcement Futures. We are particularly interested in candidates with expertise in Design Fiction and Speculative Design methodologies.
PROBabLE Futures – Probabilistic Systems in Law Enforcement Futures is a recently funded Responsible AI UK (RAi UK) Keystone Project that brings together academics, Law Enforcement, Government, Third Sector and Commercial AI Industry partners to explore responsible approaches to Probabilistic AI across the law enforcement landscape. This 4-year project is led by Northumbria University and involves a multi-disciplinary team of researchers from Glasgow, Northampton, Leicester, Cambridge and Aberdeen Universities.

The project involves engagement with primary stakeholders to analyse and map the Probabilistic AI ecosystem in law enforcement. We will investigate prior law enforcement technology case studies and anticipate future uses of AI in law enforcement. We will design and evaluate novel interfaces and systems for AI-assisted decision-making in law enforcement and use innovative research techniques, such as a mock trial involving AI outputs as evidence, with the overall aim of developing a future-oriented framework for responsible AI in law enforcement and criminal justice.
The successful candidate will be based in the Northumbria Social Computing (NorSC) research group within the Department of Computer and Information Sciences and will work collaboratively with postdoctoral researchers from law and machine learning, as well as our law enforcement, third sector and commercial partners.
This is a fixed-term role until 31st March 2028. Interviews will be held in the week commencing 18th November 2024.
About the team
This research project is being delivered within the Department of Computer and Information Sciences, as part of the research team of the PROBabLE Futures Keystone Project. The project involves working alongside our law enforcement, third sector and commercial partners to develop a framework for understanding the implications of uncertainty and building confidence in future Probabilistic AI in law enforcement, with the interests of justice and responsibility at its heart. The project is a UKRI-funded Responsible AI UK Keystone project and includes academic Co-Investigators across a number of disciplines, together with law enforcement and commercial partners.
About you
Applicants should hold a PhD (or equivalent experience) in Human-Computer Interaction or a relevant discipline and have demonstrable specialist knowledge in Interaction Design. A strong understanding of AI explainability and fairness, and of the technical, regulatory and ethical issues raised by the use of AI in law enforcement and the public sector, is desirable.
Further information about the requirements of the role is available in the person specification.