
Statistical Computing Platform Engineer

Blackfluo.ai

Paris

On-site

EUR 60,000–80,000

Full-time

Today

Job Summary

A leading data science company in France is seeking a Statistical Computing Platform Engineer to design and manage collaborative statistical environments. The ideal candidate will have extensive experience with tools like JupyterHub and GitLab, along with strong DevOps skills. This role focuses on enabling transparent, scalable statistical collaboration among teams. Candidates should hold a relevant degree and possess strong communication abilities.

Qualifications

  • 6+ years of experience managing shared environments for data science.
  • Experience with DevOps tools like Docker and Kubernetes.
  • Proficiency in Python, R, and Stata in production.
  • Ability to translate workflow needs into platform features.

Responsibilities

  • Design and deploy secure environments for statistical work.
  • Automate provisioning of shared notebooks and computational backends.
  • Support real-time collaboration across distributed teams.
  • Manage user access and ensure compliance with data protection policies.

Knowledge

Managing shared environments for data science
DevOps practices and tools
Supporting statistical programming languages
Version control and collaborative code workflows
Strong communication and documentation skills

Education

Bachelor's or Master's degree in Computer Science, Statistics, or Data Science

Tools

Docker
Kubernetes
GitLab CI/CD
Terraform
Job Description

About the Job

Statistical Computing Platform Engineer: building collaborative environments for data science and statistical programming

Position Overview

We are seeking a Statistical Computing Platform Engineer to design, deploy, and manage shared environments for collaborative statistical analysis and algorithm development. This role will focus on platforms like JupyterHub and GitLab, enabling statisticians, economists, and data scientists to collaboratively write, run, version, and share statistical code using open standards and reproducible workflows.

The ideal candidate will have a background in data science infrastructure, DevOps for analytical environments, and a strong interest in enabling transparent, scalable statistical collaboration.

Platform Design & Deployment
  • Design and deploy secure, scalable environments for collaborative statistical work using JupyterHub, RStudio Server, and similar notebook-based tools
  • Integrate version control (e.g. GitLab, GitHub) and CI/CD pipelines into statistical workflows for peer review and reproducibility
  • Implement multi-user compute environments with isolated kernels, persistent storage, and resource quotas
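As an illustration of the multi-user requirements above, a minimal `jupyterhub_config.py` might combine container isolation, per-user quotas, and persistent storage; the image name and limits below are illustrative assumptions, not part of this posting.

```python
# Sketch of a jupyterhub_config.py for a multi-user statistical platform.
# Image name, limits, and volume layout are illustrative only.
c = get_config()  # noqa: F821 -- `get_config` is injected by JupyterHub at startup

# Spawn each user's server in an isolated Docker container (isolated kernels)
c.JupyterHub.spawner_class = "dockerspawner.DockerSpawner"
c.DockerSpawner.image = "jupyter/datascience-notebook:latest"

# Per-user resource quotas
c.Spawner.mem_limit = "4G"
c.Spawner.cpu_limit = 2.0

# Persistent per-user storage: a named Docker volume mounted into the workspace
c.DockerSpawner.volumes = {"jupyterhub-user-{username}": "/home/jovyan/work"}
```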
Infrastructure & Automation
  • Automate the provisioning of shared notebooks, computational backends, and environments using Docker, Kubernetes, or Terraform
  • Maintain environments with pre-configured libraries for Python, R, and Stata, optimized for statistical work
  • Implement monitoring, logging, and performance tracking for usage and troubleshooting
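One way to automate provisioning along the lines described above is to render per-team Dockerfiles from a simple environment spec. The helper below is a hypothetical sketch: the base image and package names are assumptions for illustration, not tooling named in this posting.

```python
# Hypothetical sketch: render a Dockerfile pinning a team's statistical
# libraries, as one approach to automated environment provisioning.

BASE_IMAGE = "jupyter/datascience-notebook:latest"  # ships Python and R

def render_dockerfile(python_pkgs, r_pkgs):
    """Return Dockerfile text installing the requested Python and R packages."""
    lines = [f"FROM {BASE_IMAGE}"]
    if python_pkgs:
        # Pin exact versions in python_pkgs (e.g. "statsmodels==0.14.2")
        lines.append("RUN pip install --no-cache-dir " + " ".join(python_pkgs))
    if r_pkgs:
        quoted = ", ".join(f"'{p}'" for p in r_pkgs)
        lines.append(f'RUN R -e "install.packages(c({quoted}))"')
    return "\n".join(lines) + "\n"

if __name__ == "__main__":
    print(render_dockerfile(["statsmodels==0.14.2"], ["plm"]))
```

The rendered file can then be built and pushed by CI so every analyst starts from the same pinned environment. (Stata is licensed software and would need a separately provisioned image.)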
Collaboration Enablement
  • Support integration of shared development workflows, code repositories, and notebook-sharing templates
  • Enable real-time and asynchronous collaboration on models, scripts, and results across distributed teams
  • Develop templates and best practices for reproducible analysis pipelines and peer-reviewed code
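A reproducible-pipeline template of the kind mentioned above could take the form of a `.gitlab-ci.yml` that re-executes an analysis notebook on every push, so peer reviewers see results regenerated from source. File names, image, and dependencies below are illustrative assumptions.

```yaml
# Sketch of a .gitlab-ci.yml for a peer-reviewed, reproducible analysis.
stages:
  - reproduce

reproduce-analysis:
  stage: reproduce
  image: jupyter/datascience-notebook:latest
  script:
    - pip install -r requirements.txt
    - jupyter nbconvert --to notebook --execute analysis.ipynb --output executed.ipynb
  artifacts:
    paths:
      - executed.ipynb   # reviewers download the freshly executed notebook
```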
Security & Compliance
  • Manage user access, authentication (OAuth, LDAP, SSO), and secure execution of notebooks in shared environments
  • Ensure compliance with data protection policies and sandboxing of user workloads
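For the authentication duties above, JupyterHub delegates sign-in to OAuth providers via the `oauthenticator` package; a GitLab SSO sketch might look like the following, where the client credentials, callback URL, and usernames are placeholders.

```python
# Sketch of GitLab OAuth/SSO wiring in jupyterhub_config.py.
# All IDs, secrets, URLs, and usernames below are placeholders.
c = get_config()  # noqa: F821 -- injected by JupyterHub at startup

c.JupyterHub.authenticator_class = "oauthenticator.gitlab.GitLabOAuthenticator"
c.GitLabOAuthenticator.client_id = "REPLACE_WITH_APPLICATION_ID"
c.GitLabOAuthenticator.client_secret = "REPLACE_WITH_APPLICATION_SECRET"
c.GitLabOAuthenticator.oauth_callback_url = "https://hub.example.org/hub/oauth_callback"

# Restrict who may sign in, and designate platform administrators
c.Authenticator.allowed_users = {"alice", "bob"}
c.Authenticator.admin_users = {"alice"}
```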
Required Qualifications
Technical Skills
  • 6+ years of experience managing shared environments for data science or statistical analysis (e.g. JupyterHub, RStudio Server, VSCode Server)
  • Proficiency with DevOps practices and tools (Docker, Kubernetes, GitLab CI/CD, Ansible, Terraform)
  • Experience supporting statistical programming languages (Python, R, Stata) in a production environment
  • Knowledge of version control, collaborative code workflows, and reproducible research practices
Soft Skills
  • Ability to work closely with statisticians, researchers, and data scientists to translate workflow needs into platform features
  • Strong communication and documentation skills
  • Passion for open science, transparency, and collaboration
Preferred Qualifications
  • Bachelor's or Master's degree in Computer Science, Statistics, Data Science, or a related technical field
  • Experience in academic, governmental, or international research organizations
  • Familiarity with HPC environments or cloud-based statistical computing (e.g., GCP, AWS, Azure for research)
  • Background in open data workflows, FAIR principles, or statistical methodology