
A leading research institute in Saclay is offering a Master’s-level internship focused on improving human‑AI collaboration through a better understanding of human reasoning errors. The role involves studying fuzzy logic and cognitive biases, with a strong emphasis on programming skills in C#. Ideal candidates will have a keen interest in scientific research and AI technologies.
Internship
This internship aims to improve human‑AI collaboration, where accuracy alone is not enough: trust requires understanding the AI's reasoning. Explainable AI (XAI) focuses on generating clear, faithful, concise, and user‑adapted explanations, since a poor explanation can harm cooperation.
The work concerns ExpressIF®, a symbolic fuzzy‑logic AI developed at CEA. Although the system is interpretable, selecting the rules most relevant to a given user can be difficult.
The project addresses human‑AI disagreement: the AI must detect possible human reasoning errors (lack of knowledge, confirmation or attention bias, illusory correlation, or differences in uncertainty) in order to adapt its explanations. The objectives are to study fuzzy logic, ExpressIF®, and cognitive biases; to design a model of user reasoning errors; and to integrate and evaluate it within ExpressIF®.
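As a purely illustrative aside, and since the posting emphasises C# skills and rule-based fuzzy logic, the short C# sketch below shows what a fuzzy rule and a naive rule-relevance ranking might look like. It does not use ExpressIF®'s actual API; the variable, the rules, and the membership-function parameters are all hypothetical assumptions.

// A minimal, hypothetical sketch (not ExpressIF®'s actual API): it only
// illustrates a fuzzy rule and a naive "relevance" ranking over rules.
using System;
using System.Collections.Generic;
using System.Linq;

class FuzzyRuleSketch
{
    // Triangular membership function: the degree (0..1) to which x
    // belongs to a fuzzy set described by the points a <= b <= c.
    static double Triangular(double x, double a, double b, double c)
    {
        if (x <= a || x >= c) return 0.0;
        return x < b ? (x - a) / (b - a) : (c - x) / (c - b);
    }

    static void Main()
    {
        double temperature = 27.0; // illustrative input value

        // Two made-up rules over the "temperature" variable.
        var rules = new List<(string Text, double Activation)>
        {
            ("IF temperature IS warm THEN suggest opening the window",
                Triangular(temperature, 18, 24, 30)),
            ("IF temperature IS hot THEN suggest turning on the fan",
                Triangular(temperature, 26, 32, 38)),
        };

        // One naive notion of relevance: explain the decision with the
        // rules that are most strongly activated for this input.
        foreach (var rule in rules.OrderByDescending(r => r.Activation))
            Console.WriteLine($"{rule.Activation:F2}  {rule.Text}");
    }
}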
Located in Saclay, in the southern Île‑de‑France region, CEA LIST is a scientific and technological research centre dedicated to the development of software, embedded systems, and sensors for applications in defence, security, energy, nuclear, environment, and healthcare.
CEA LIST has more than 700 researchers focusing on intelligent digital systems, with core areas including artificial intelligence, the factory of the future, innovative instrumentation, cyber‑physical systems, and digital health.
Within the institute, the laboratories of the Digital Instrumentation Department work on developing and transferring cutting‑edge AI technologies to industry. The technical scope of our engineers and researchers covers signal analysis (i.e., time series as well as spectra) produced by equipment developed internally by CEA teams or by external companies.
The exploitation of these data relies on a broad range of machine learning methods, including both numerical AI (deep neural networks, random forests, SVMs) and symbolic AI (knowledge‑based systems).
You will join a team of five permanent researchers working on the ExpressIF platform, a fully in‑house developed symbolic artificial intelligence system designed to be interpretable. The goal is not to build a “black box” AI—whose reasoning process cannot be understood by humans—that simply provides decisions to an operator, but rather an AI that can be understood: experts can therefore extract insights for their own research, correct knowledge if it has been misled by data, or simply enrich it when certain elements are missing.
We are looking for a second‑year Master’s student or third‑year engineering school student specialising in Computer Science and/or Mathematics, with the following profile:
Saclay
Gif‑sur‑Yvette
Bac+5 - Master 2
Yes