Researcher (AI Interpretability Project)

University of Oxford

Oxford

On-site

GBP 60,000 - 80,000

Full time

3 days ago

Job summary

A leading educational institution is seeking a motivated Researcher to contribute to an AI Interpretability project. The role involves delivering research and collaborating with researchers and students on experiments and publications. Candidates should be adept at both independent and team-oriented research in a high-expectation environment. The position is full-time and fixed-term for 6 months, and the institution is committed to equal opportunity and to welcoming applicants from diverse backgrounds.

Qualifications

  • Motivated by rigorous and ambitious research.
  • Ability to transition between conceptual reasoning and rapid implementation.
  • Enjoys working collaboratively in a high-expectation environment.

Responsibilities

  • Deliver research and support tool and experiment development.
  • Collaborate with researchers and students across workstreams.
  • Assist in developing funding proposals and reports.

Skills

  • Independent research
  • Collaboration
  • Technical support
  • Funding proposal development
  • Research documentation

Job description

We have an exciting opportunity for a Researcher to work on the AI Interpretability research project. Reporting to the Principal Investigator, the post holder will be a key contributor to the research group, responsible for delivering research and supporting the development of shared tools, experiments, and publications. The role involves both independent research and collaboration with researchers and students across related workstreams. The post holder may provide technical support to colleagues and will have the opportunity to assist in the development of funding proposals, progress reports, and related documentation for ongoing and future grants in the research area.

The successful candidate will join the Interpretability research project, which develops methods for analysing, explaining, and intervening in the internal representations and computations of large-scale AI systems. The project explores areas including automated circuit discovery, structured capability hypotheses from natural language queries, model‑internal debugging, interpretability‑driven model control, and scalable evaluation of mechanistic explanations. The team collaborates with multiple academic and industry partners across AI safety, interpretability, and governance.

The ideal candidate will be motivated by rigorous and ambitious research, able to transition between conceptual reasoning and rapid implementation, and enjoy working collaboratively within an interdisciplinary and high‑expectation research environment.

The post is full-time and fixed-term for 6 months.

The Oxford Martin School supports the University’s commitment to equal opportunity, and to being a place where everyone belongs and is supported to succeed. We recognise how the diversity of our community enriches our ability to deliver on our academic mission.

We welcome applications from individuals from all backgrounds, including those under‑represented within higher education. No applicant or members of staff shall be unlawfully discriminated against on the basis of age, disability, gender reassignment, marriage or civil partnership, pregnancy or maternity, race, religion or belief, sex, or sexual orientation.

You will be required to upload your CV and a supporting statement as part of your online application. Your supporting statement should address each of the essential and desirable selection criteria listed in the job description and explain how you meet each one. Both documents must be submitted for your application to be considered.
