Research Associate in AI Verification

The University of Manchester

Manchester

On-site

GBP 30,000 - 40,000

Full time

Job summary

A leading research university in Manchester is seeking a post-doctoral research associate to work on the 'Hardware-Level AI Safety Verification' project. This role involves collaborating with experts to verify AI models at the hardware level, focusing on safety and implementation issues. The ideal candidate will have a strong background in formal methods and machine learning. Excellent benefits include a leading pension scheme and generous leave entitlement.

Benefits

Opportunity to work on cutting-edge projects
Collaboration with leading experts
Market-leading pension scheme
Excellent health and wellbeing services
Exceptional annual leave
Paid closure over Christmas
Discounts at major retailers

Qualifications

  • Strong focus on AI safety and neural network verification.
  • Ability to collaborate in an interdisciplinary setting.
  • Experience in algorithm development and verification tools.

Responsibilities

  • Work on AI systems verification at the hardware level.
  • Collaborate with academics and interns across universities.
  • Contribute to publication and intellectual property development.

Skills

Formal methods
Machine learning
Control theory
Numerical analysis

Education

PhD in a relevant discipline

Job description

This 18-month appointment forms part of the project "Hardware-Level AI Safety Verification", funded by the Advanced Research + Invention Agency (ARIA) in partnership with the University of Manchester and the University of Birmingham. The project belongs to the Mathematics for Safe AI funding stream, which aims to assess how mathematics - from scientific world-models to mathematical proofs - can be leveraged to ensure that powerful AI systems interact safely and as intended with real-world systems and populations.

The project "Hardware-Level AI Safety Verification" will address a fundamental semantic mismatch between the formal guarantees produced by neural network verification tools and the actual implementation of neural networks at the hardware level. Specifically, hardware-level effects such as quantisation and sampling are often ignored during the verification of AI models. Yet, they are pervasive phenomena in any engineering application where digital compute platforms interact with the physical world. Their impact on the behaviour of neural network controllers and other AI models acting in a physical environment is not well understood.
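
For illustration only (this sketch is not part of the project or of this advert), the short Python example below shows the kind of mismatch in question: a toy network whose floating-point output is negative for a given input becomes positive once its weights are snapped to an int8-style grid, so a decision established for the verified float model does not carry over to the quantised implementation. All weights, inputs and the quantisation scheme here are invented for the example.

import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def quantise(w, levels=127.0):
    # Symmetric, per-tensor quantisation: snap each weight to the nearest of
    # 2*levels+1 evenly spaced values (an int8-style grid). Illustrative only.
    m = np.abs(w).max()
    return np.round(w / m * levels) / levels * m

# Tiny one-hidden-layer ReLU network with made-up float32 weights.
W1 = np.array([[ 0.73, -0.41],
               [-0.52,  0.66]], dtype=np.float32)
b1 = np.array([0.05, -0.03], dtype=np.float32)
W2 = np.array([[0.9, -1.1]], dtype=np.float32)
b2 = np.array([-0.0665], dtype=np.float32)

def forward(x, W1, W2):
    return (W2 @ relu(W1 @ x + b1) + b2)[0]

x = np.array([0.10, 0.12], dtype=np.float32)      # input near the decision boundary

y_float = forward(x, W1, W2)                      # "verified" float model
y_quant = forward(x, quantise(W1), quantise(W2))  # "deployed" quantised model

print(f"float model output:     {y_float:+.6f} -> class {int(y_float > 0)}")
print(f"quantised model output: {y_quant:+.6f} -> class {int(y_quant > 0)}")
# For this input the rounding error introduced by quantisation flips the sign
# of the output, i.e. the predicted class: a guarantee established for the
# float weights does not automatically hold for the quantised implementation.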

The project is a collaborative effort: academics, post-docs and interns across universities will work together to build better algorithms, software tools and benchmarks for assessing the safety of AI implementations at the software and hardware levels. We are recruiting an enthusiastic and collaborative post-doctoral research associate with expertise in formal methods, machine learning, control theory, numerical analysis, or a related discipline, and a strong focus on AI safety and neural network verification. The post holder is expected to work closely with the two principal academic investigators in Manchester and Birmingham.

The role will be based in the Department of Computer Science at the University of Manchester. The department is one of the oldest Computer Science departments in the United Kingdom and hosts around 60 academic staff. The role will be associated with the Systems and Software Security research group, but interactions with other relevant research groups (Autonomy and Verification, Formal Methods, Machine Learning and Robotics) are expected.

What we offer:

  • Opportunity to work on a cutting-edge project with real-world impact
  • Collaboration with leading international experts in AI verification
  • Potential for publication and intellectual property development
  • Fantastic market-leading pension scheme
  • Excellent employee health and wellbeing services
  • Exceptional starting annual leave entitlement, plus bank holidays
  • Additional paid closure over the Christmas period
  • Local and national discounts at a range of major retailers

As an equal-opportunities employer we support an inclusive working environment and welcome applicants from all sections of the community regardless of age, disability, ethnicity, gender, gender expression, religion or belief, sex, sexual orientation and transgender status. All appointments are made on merit.

Our University is positive about flexible working - you can find out more here

Blended working arrangements may be considered

Please note that we are unable to respond to enquiries, accept CVs or applications from Recruitment Agencies.

Enquiries about the vacancy, shortlisting and interviews:

Name: Dr. Edoardo Manino, Lecturer in AI Security

Email: edoardo.manino@manchester.ac.uk

General enquiries:

Email: People.Recruitment@manchester.ac.uk

Technical support:

Jobtrain: 0161 850 2004, https://jobseekersupport.jobtrain.co.uk/support/home

This vacancy will close for applications at midnight on the closing date.

Please see the link below for the Further Particulars document, which contains the person specification criteria.