
Experienced Data Engineer - Streaming Platform


Paris

On-site

EUR 45,000 - 75,000

Full-time

Posted 24 days ago

Job summary

An innovative company is looking for a data engineer with a passion for real-time streaming systems. In this role, you will be responsible for building and optimizing data pipelines that process millions of events. You will collaborate with a team of experts to integrate advanced advertising monetization solutions. This position is a unique opportunity to contribute to strategic projects with a direct impact on the business. If you are driven by technology and want to be part of a dynamic team, this opportunity is for you.

Benefits

Competitive salary based on experience
Comprehensive relocation package
Swile lunch vouchers
Gymlib (100% covered)
Premium SideCare healthcare coverage
Child day care facilities
Wellness activities at the office
Unlimited vacation policy
Remote days

Qualifications

  • 3-5+ years of experience in data engineering with a focus on streaming systems.
  • Strong programming skills in Java, Scala, or Python within distributed systems.

Responsibilities

  • Build and optimize real-time data pipelines to process bid requests.
  • Collaborate with backend engineers to integrate OpenRTB signals into our data pipelines.

Skills

Data engineering
Real-time streaming systems
Java
Scala
Python
Event schema management
Kubernetes
CI/CD

Tools

Apache Flink
Spark Structured Streaming
GCP Pub/Sub
AWS Kinesis
Apache Pulsar
Kafka
Terraform
Docker
Helm

Job description

Founded in 2013, Voodoo is a tech company that creates mobile games and apps with a mission to entertain the world. With 800 employees, 7 billion downloads, and over 200 million active users, Voodoo is the #3 mobile publisher worldwide in terms of downloads, after Google and Meta. Our portfolio includes chart-topping games like Mob Control and Block Jam, alongside popular apps such as BeReal and Wizz.

Team

The Engineering & Data team builds innovative tech products and platforms to support the impressive growth of their gaming and consumer apps which allow Voodoo to stay at the forefront of the mobile industry.

Within the Data team, you'll join the Ad-Network Team, an autonomous squad of around 30 people. The team is composed of top-tier software engineers, infrastructure engineers, data engineers, mobile engineers, and data scientists (including 3 Kaggle Masters). Its goal is to enable Voodoo to monetize our inventory directly with advertising partners, and it relies on advanced technological solutions to optimize advertising in a real-time bidding environment. This is a strategic topic with a significant impact on the business.

This role is based in Paris and requires being onsite 3 days per week.

Role

  • Build, maintain, and optimize real-time data pipelines to process bid requests, impressions, clicks, and user engagement data.
  • Develop scalable solutions using tools like Apache Flink, Spark Structured Streaming, or similar stream processing frameworks (see the illustrative sketch after this list).
  • Collaborate with backend engineers to integrate OpenRTB signals into our data pipelines and ensure smooth data flow across systems.
  • Ensure data pipelines handle high-throughput, low-latency, and fault-tolerant processing in real-time.
  • Write clean, well-documented code in Java, Scala, or Python for distributed systems.
  • Work with cloud-native messaging and event platforms such as GCP Pub/Sub, AWS Kinesis, Apache Pulsar, or Kafka to ensure reliable message delivery.
  • Assist in the management and evolution of event schemas (Protobuf, Avro), including data consistency and versioning.
  • Implement monitoring, logging, and alerting for streaming workloads to ensure data integrity and system health.
  • Continuously improve data infrastructure for better performance, cost-efficiency, and scalability.
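
As a purely illustrative sketch of the kind of work described above (not Voodoo's actual code), the Flink job below consumes raw OpenRTB bid requests from a Kafka topic and counts them per app over one-minute windows. The broker address, topic name, consumer group, and extractAppId helper are hypothetical, and the example assumes the flink-streaming-java and flink-connector-kafka dependencies.

import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.api.common.typeinfo.Types;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.connector.kafka.source.KafkaSource;
import org.apache.flink.connector.kafka.source.enumerator.initializer.OffsetsInitializer;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.windowing.assigners.TumblingProcessingTimeWindows;
import org.apache.flink.streaming.api.windowing.time.Time;

public class BidRequestCountJob {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Kafka source for raw OpenRTB bid requests (broker and topic names are hypothetical).
        KafkaSource<String> bidRequests = KafkaSource.<String>builder()
                .setBootstrapServers("kafka:9092")
                .setTopics("openrtb.bid-requests")
                .setGroupId("bid-request-counts")
                .setStartingOffsets(OffsetsInitializer.latest())
                .setValueOnlyDeserializer(new SimpleStringSchema())
                .build();

        DataStream<String> raw =
                env.fromSource(bidRequests, WatermarkStrategy.noWatermarks(), "bid-requests");

        // Count bid requests per app in one-minute processing-time windows and print the result;
        // a production job would sink to a warehouse or real-time OLAP store instead.
        raw.map(json -> Tuple2.of(extractAppId(json), 1L))
           .returns(Types.TUPLE(Types.STRING, Types.LONG))
           .keyBy(pair -> pair.f0)
           .window(TumblingProcessingTimeWindows.of(Time.minutes(1)))
           .sum(1)
           .print();

        env.execute("bid-request-counts");
    }

    // Stub for illustration only: real code would deserialize the payload (e.g. Avro/Protobuf)
    // and read the app identifier from the OpenRTB request.
    private static String extractAppId(String payload) {
        return payload;
    }
}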

Profile (Must have)
  • 3-5+ years of experience in data engineering, with a strong focus on real-time streaming systems.
  • Familiarity with stream processing tools like Apache Flink, Spark Structured Streaming, Beam, or similar frameworks.
  • Solid programming experience in Java, Scala, or Python, especially in distributed or event-driven systems.
  • Experience working with event streaming and messaging platforms like GCP Pub/Sub, AWS Kinesis, Apache Pulsar, or Kafka.
  • Hands-on knowledge of event schema management, including tools like Avro or Protobuf.
  • Understanding of real-time data pipelines, with experience handling large volumes of event-driven data.
  • Comfortable working in Kubernetes for deploying and managing data processing workloads in cloud environments (AWS, GCP, etc.).
  • Exposure to CI/CD workflows and infrastructure-as-code tools such as Terraform, Docker, and Helm.

Nice to have
  • Familiarity with real-time analytics platforms (e.g., ClickHouse, Pinot, Druid) for querying large volumes of event data.
  • Exposure to service mesh, auto-scaling, or cost optimization strategies in containerized environments.
  • Contributions to open-source projects related to data engineering or stream processing.

Benefits
  • Competitive salary based on experience
  • Comprehensive relocation package (including visa support)
  • Swile lunch vouchers
  • Gymlib (100% covered by Voodoo)
  • Premium SideCare healthcare coverage, 100% covered by Voodoo for you and your family
  • Child day care facilities (Les Petits Chaperons rouges)
  • Wellness activities in our Paris office
  • Unlimited vacation policy
  • Remote days