
Post-doc (M/F): Explanations of AI Systems via Causal Abstraction

CNRS

France

On-site

EUR 35 000 - 45 000

Full-time


Job summary

A leading research institution in France is seeking a post-doctoral researcher to contribute to AI interpretability research. The ideal candidate will have a PhD in machine learning or a related field and strong programming skills in Python. Responsibilities include developing evaluation metrics, running experiments, and collaborating with a dynamic research team. The position offers access to GPU computing infrastructure and participation in an active local AI community.

Benefits

Access to GPU computing infrastructure
Collaboration with PhD students and international partners

Qualifications

  • Strong command of deep learning and an interest in interpretability.
  • Experience in causal modeling, representation learning, or mechanistic interpretability is appreciated.
  • Ability to conduct independent research and collaborate within a research team.

Responsibilities

  • Contribute to causal abstraction research and evaluate AI interpretability.
  • Implement algorithms and experimental pipelines in Python/PyTorch.
  • Collaborate with the PI and PhD students on research projects.

Skills

Deep learning
Python programming
Causal modeling
NLP
Scientific communication

Education

PhD in machine learning, NLP, causality, or related discipline

Tools

PyTorch
GPU clusters

Job description

Organisation/Company: CNRS
Department: Laboratoire d'Informatique de Grenoble
Research Field: Engineering, Computer science, Mathematics
Researcher Profile: First Stage Researcher (R1)
Country: France
Application Deadline: 12 Dec 2025 - 23:59 (UTC)
Type of Contract: Temporary
Job Status: Full-time
Hours Per Week: 35
Offer Starting Date: 1 Feb 2026
Is the job funded through the EU Research Framework Programme? Not funded by an EU programme
Is the job related to a staff position within a Research Infrastructure? No

Offer Description

The post-doctoral researcher will contribute to the causal abstraction research direction, which aims to build a rigorous benchmark for evaluating AI interpretability within the framework of causal abstraction and to develop new interpretability methods. Their mission is to advance the theoretical foundations of the project, develop the evaluation metrics, help establish a robust evaluation pipeline for interpretability methods, and build new interpretability algorithms.
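
For illustration only (not part of the official offer text), the sketch below shows the kind of operation at the heart of causal-abstraction evaluations: an interchange intervention in PyTorch, where a hidden representation cached from one input is swapped into the forward pass on another. All module names, sizes, and inputs are hypothetical placeholders, not the project's actual codebase.

    # Illustrative sketch only: a minimal interchange intervention in PyTorch.
    # All names and dimensions are hypothetical, not the project's codebase.
    import torch
    import torch.nn as nn

    class TinyModel(nn.Module):
        """Stand-in low-level model with one named intermediate representation."""
        def __init__(self, d_in=8, d_hidden=16, d_out=2):
            super().__init__()
            self.encoder = nn.Linear(d_in, d_hidden)
            self.head = nn.Linear(d_hidden, d_out)

        def forward(self, x, patch=None):
            h = torch.relu(self.encoder(x))
            if patch is not None:
                # Interchange intervention: replace the hidden state with one
                # cached from a "source" input before decoding.
                h = patch
            return self.head(h), h

    model = TinyModel()
    base, source = torch.randn(4, 8), torch.randn(4, 8)
    with torch.no_grad():
        _, h_source = model(source)                 # cache source activations
        y_patched, _ = model(base, patch=h_source)  # base run with patched state

    # A causal-abstraction test compares y_patched with the output a high-level
    # causal model predicts under the corresponding intervention; systematic
    # agreement over many such swaps supports the proposed alignment.
    print(y_patched.shape)  # torch.Size([4, 2])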

The post-doc will carry out theoretical work on causal abstraction and causal alignment, implement algorithms and experimental pipelines in Python/PyTorch, and run experiments on GPU clusters. They will collaborate closely with the PI and the PhD students of the team, interact with international partners, and participate in the supervision and coordination of Master's interns involved in the project. Regular preparation of research results, contribution to conference submissions, and participation in project meetings will be part of their activities.

The post-doc will join CNRS in the GetAlp team at the Laboratoire d'Informatique de Grenoble (LIG). GetAlp conducts research in NLP, machine learning, evaluation, and interpretability. The project will be supervised by Maxime Peyrard (CNRS), with collaboration from PhD students and external partners. The researcher will benefit from an active local community in AI and access to GPU computing infrastructure.

The position requires a PhD in machine learning, NLP, causality, or a related discipline, with a strong command of deep learning and an interest in interpretability. Excellent programming skills in Python, familiarity with modern neural architectures, and the ability to conduct independent research are expected. Experience in causal modeling, representation learning, or mechanistic interpretability is appreciated. The successful candidate should also demonstrate good scientific communication skills and the ability to collaborate within a research team.
