

Senior Data Engineer

Trades Workforce Solutions

Eu

Hybrid

EUR 40 000 - 60 000

Full-time

Today
Be among the first to apply



Job summary

A leading data solution provider in Normandie, France is seeking a Senior Data Engineer to lead the migration of the data infrastructure to a new, self-hosted architecture. You will collaborate with a team to enhance data pipeline performance and reliability, using tools like Kafka, ClickHouse, and S3. The position offers a remote-friendly environment with flexible hours, excellent benefits, and support for your well-being, making it an exciting opportunity for driven professionals.

Benefits

Remote-friendly & Flexible working hours
Apple devices for work
Team buildings abroad
Health insurance coverage
WellBeing program
Unlimited time-off policy

Qualifications

  • Strong experience building, maintaining, and optimizing ETL/ELT pipelines.
  • Proficiency in Python and SQL for data processing and analytics.
  • Hands-on experience with Kafka, Trino, ClickHouse, Spark, or similar.
  • Experience with S3 or equivalent object storage solutions.
  • Proficiency with Docker, Kubernetes, Git, and CI/CD pipelines.
  • Familiarity with workflow orchestration tools like Airflow or dbt.
  • Ability to optimize system performance and troubleshoot complex data issues.
  • Self-starter mentality in a fast-paced environment.
  • Fluent English for team collaboration.

Responsibilities

  • Lead and execute the migration from Firebase-BigQuery-Looker to a self-hosted stack.
  • Design, develop, and optimize scalable, high-performance data pipelines.
  • Automate data processing and workflow orchestration.
  • Enhance reliability, scalability, and cost-efficiency of data infrastructure.
  • Collaborate to define best practices for data processing and storage.
  • Develop internal tools for monitoring data quality and lineage.
  • Optimize query performance and ensure efficient data modeling.

Skills

Data Engineering
Python
SQL
Kafka
Trino
ClickHouse
Spark
S3
Docker
Kubernetes
Git
CI/CD
Airflow

Job description
What We Offer:
  • A team of exceptional people who celebrate our community by being supportive and creative all the way. Our head of data once worked on a mind-reading helmet; one software developer runs a certified psychology practice; one QA engineer is a vet and another a skipper; and we have enough musicians to start our own band. Together we multiply each other’s talents, which inspires us to develop a product we’re all proud of.
  • A product that promotes health and fitness in 100 countries. Sweatcoin has proven to help people and has given rise to many inspiring stories like this one: https://blog.sweatco.in/one-sweatcoiners-journey-to-100000-steps/.
  • A startup that actually works. We are completely self-sufficient, yet our investors are excited to provide us with even more resources to keep growing. We recently broke our record of 10M new users in a single week.
  • Models that help us verify steps, so there is no way to cheat: a dog can’t earn coins for its owner.
  • Automated A/B tests, analytics that are deeply integrated into the product, and a modern data stack (Jupyter, BigQuery, Airflow, Looker).

We are looking for a Senior Data Engineer to join our team and play a key role in migrating our data infrastructure to a new architecture. You will work alongside two other engineers on this large-scale migration and help shape the future of our data platform. Over time, our team will evolve into a platform-focused group, building automation tools, improving performance, ensuring data pipeline resilience, and strengthening data governance.

What You Will Do:
  • Lead and execute the migration from Firebase-BigQuery-Looker to a self-hosted stack including Snowplow, Kafka, ClickHouse, Trino, Spark, S3, and Redash.
  • Design, develop, and optimise scalable, high-performance data pipelines (a minimal ingestion sketch follows this list).
  • Automate data processing and workflow orchestration.
  • Enhance data infrastructure reliability, scalability, and cost-efficiency.
  • Collaborate with engineers and analysts to define best practices for data processing, storage, and governance.
  • Develop internal tools for data quality monitoring, lineage tracking, and debugging.
  • Optimize query performance and ensure efficient data modeling.
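As an illustration of the pipeline work described above, here is a minimal, hypothetical sketch in Python that consumes JSON events from a Kafka topic and batch-inserts them into ClickHouse, using the confluent-kafka and clickhouse-connect client libraries. The broker address, topic, table, and column names are placeholder assumptions, not our actual configuration.

import json

import clickhouse_connect
from confluent_kafka import Consumer

# Hypothetical connection settings; a real deployment would load these from config.
consumer = Consumer({
    "bootstrap.servers": "kafka:9092",
    "group.id": "events-loader",
    "auto.offset.reset": "earliest",
    "enable.auto.commit": False,
})
consumer.subscribe(["app_events"])  # placeholder topic name

client = clickhouse_connect.get_client(host="clickhouse")

BATCH_SIZE = 1000
batch = []

try:
    while True:
        msg = consumer.poll(timeout=1.0)
        if msg is None or msg.error():
            continue
        event = json.loads(msg.value())
        batch.append((event["user_id"], event["event_name"], event["ts"]))
        if len(batch) >= BATCH_SIZE:
            # ClickHouse favours large batched inserts over row-by-row writes.
            client.insert("events", batch, column_names=["user_id", "event_name", "ts"])
            batch.clear()
            consumer.commit(asynchronous=False)
finally:
    consumer.close()

In practice a consumer like this would also flush partial batches on shutdown and handle malformed messages, but the shape of the work is the same: move events reliably and cheaply from Kafka into ClickHouse.
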
What We Expect From You:
  • Expertise in Data Engineering: Strong experience building, maintaining, and optimizing ETL/ELT pipelines.
  • Strong Coding Skills: Proficiency in Python and SQL for data processing and analytics.
  • Distributed Systems Experience: Hands-on experience with Kafka, Trino, ClickHouse, Spark, or similar.
  • Cloud & Storage: Experience with S3 or equivalent object storage solutions.
  • Infrastructure & Tooling: Proficiency with Docker, Kubernetes, Git, and CI/CD pipelines.
  • Orchestration & Automation: Familiarity with workflow orchestration tools like Airflow or dbt (see the DAG sketch after this list).
  • Analytical Thinking: Ability to optimize system performance and troubleshoot complex data issues.
  • Self-Starter Mentality: Comfortable working in a fast-paced, evolving environment with minimal supervision.
  • Strong Communication Skills: Fluent English to ensure smooth collaboration within the team.
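For the orchestration point above, here is a minimal Airflow DAG sketch (Airflow 2.4+); the DAG id, task names, and callables are hypothetical placeholders rather than our actual workflows.

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_events(**context):
    # Placeholder: pull the day's raw events from object storage such as S3.
    print("extracting events for", context["ds"])


def load_to_clickhouse(**context):
    # Placeholder: transform the extracted events and batch-insert them into ClickHouse.
    print("loading events for", context["ds"])


with DAG(
    dag_id="daily_events_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_events", python_callable=extract_events)
    load = PythonOperator(task_id="load_to_clickhouse", python_callable=load_to_clickhouse)
    extract >> load

A DAG along these lines illustrates the properties the role cares about: explicit task dependencies, scheduled and idempotent daily runs, and steps small enough to retry independently.
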
Nice To Have:
  • Experience with Snowplow for event tracking and data collection.
  • Knowledge of data governance and security best practices.
  • Familiarity with machine learning pipelines and real-time analytics.
What You Get In Return:
  • Remote-friendly & Flexible working hours. The flexibility is incredible: performance is based on output rather than hours spent working, and you can be wherever you want!
  • Apple devices for work.
  • Team buildings abroad in exciting locations!
  • Health insurance coverage.
  • WellBeing program, which supports up to 2 counselling sessions per month.
  • Unlimited time-off policy.

If you feel that’s a match, we would be excited to have you on our team!
