
Senior Data Platform Engineer

Coders Connect

Lyon

On-site

EUR 60,000–85,000

Full-time

25 days ago

Job Summary

A cutting-edge data tech company in Lyon is seeking a Senior Data Platform Engineer to enhance their core data infrastructure. This role involves optimizing APIs, managing high-throughput ingestion pipelines, and collaborating closely with a small agile team. Candidates should have 5–7+ years of experience with data platforms, and expertise in Kafka, ClickHouse, and Kubernetes. The position offers competitive salary, hybrid flexibility, and direct access to founders.

Benefits

Competitive salary plus equity
Direct access to founders
Hybrid work flexibility

Qualifications

  • 5–7+ years of experience with production data platforms, streaming, databases, and real-time systems.
  • Deep hands-on expertise with Kafka, ClickHouse, Kubernetes, PySpark, and cloud data lakes.
  • Proven track record building scalable, reliable data pipelines and APIs.

Responsibilities

  • Own and evolve the platform backbone: Kafka streaming, S3 data lakes, and PySpark processing.
  • Build and optimize APIs for data analytics and multi-tenant access.
  • Manage ingestion pipelines with high throughput and maintain schema controls.

Skills

Kafka
ClickHouse
Kubernetes
PySpark
Data pipelines

Tools

Typesense
S3

Job Description

Coders Connect is partnering with a cutting-edge data tech company that operates one of the largest real-time social data collection and processing platforms globally — handling tens of millions of posts daily across thousands of nodes worldwide. They serve top-tier clients in AI, cyber defense, and investment through powerful data and analytics APIs.

As a Senior Data Platform Engineer, you’ll be the guardian of their core data infrastructure, focusing on enhancing reliability, scalability, and cost-efficiency. You’ll work hands-on with streaming, storage, APIs, and big data pipelines to build the next generation of “data-as-a-service” products.

What You’ll Do:

  • Own and evolve the platform backbone: Kafka streaming, S3 data lakes, PySpark processing, and OLAP via ClickHouse
  • Build and optimize APIs for data analytics, search, and multi-tenant access
  • Manage ingestion pipelines with high throughput (millions of events per hour) and maintain schema and data quality controls
  • Drive operational excellence through monitoring, autoscaling, and disaster recovery plans
  • Collaborate closely with a small, agile team reporting directly to founders — quick decision-making and high autonomy

Requirements

  • 5–7+ years of experience with production data platforms, streaming, databases, and real-time systems
  • Deep hands-on expertise with Kafka, ClickHouse, Kubernetes, PySpark, Typesense (or similar full-text search), and S3 or other cloud data lakes
  • Proven track record building scalable, reliable data pipelines and data / analytics APIs in a fast-moving environment
  • Strong product mindset focused on cost-efficiency, automation, and API-first delivery
  • Comfortable working in a small, highly autonomous team with responsibility and freedom to influence tech and culture

Benefits

Why This Role?

  • Be a key player in a trailblazing company that processes massive social data streams daily
  • Work in Lyon with hybrid flexibility
  • Competitive salary plus equity (BSPCE)
  • Direct access to founders and leadership — your ideas and work will have immediate impact
  • Shape the future of data platforms and analytics-as-a-service

Ready to step up and own the data platform? Apply today!
