Northumbria University is seeking a Postdoctoral Research Associate in Human-Centred AI. This role involves research on Probabilistic AI in law enforcement, focusing on innovative design methodologies and responsible AI practices. The ideal candidate holds a relevant PhD and has expertise in Interaction Design and ethical considerations in AI.
Northumbria University
Newcastle upon Tyne, United Kingdom
ABOUT THE ROLE
We are seeking an innovative Postdoctoral Research Associate with expertise in Human-Computer Interaction and a strong interest in Responsible AI to join our interdisciplinary team. The role involves investigating Probabilistic AI in Law Enforcement Futures, focusing on Design Fiction and Speculative Design methodologies.
PROBabLE Futures is a funded Responsible AI UK project exploring responsible approaches to Probabilistic AI in law enforcement. It involves collaboration with academics, law enforcement, government, third sector, and AI industry partners across the UK. The project aims to analyze and map the Probabilistic AI ecosystem, explore future AI scenarios, and develop interfaces and systems for AI-assisted decision-making, including innovative techniques like mock trials involving AI outputs.
The candidate will be based in the Northumbria Social Computing (NorSC) research group within the Department of Computer and Information Sciences, working with researchers from law and machine learning, as well as law enforcement and industry partners.
This is a fixed-term role until 31 March 2028. Interviews will be held during the week commencing 18 November 2024.
ABOUT THE TEAM
This project sits within the Department of Computer and Information Sciences at Northumbria University and involves collaboration with law enforcement, third-sector, and industry partners to develop a framework for responsible Probabilistic AI in law enforcement, with an emphasis on justice and responsibility. It is funded by UKRI and brings together multidisciplinary academic Co-Is, law enforcement agencies, and commercial partners.
ABOUT YOU
Applicants should hold a PhD (or equivalent) in Human-Computer Interaction or a related field, with demonstrable expertise in Interaction Design. A solid understanding of AI explainability, fairness, and the ethical and regulatory issues surrounding AI in law enforcement and the public sector is desirable.