HEAD OF LOGISTIC OPERATIONS (M/F)

Criteo

Paris

On-site

EUR 100,000 - 125,000

Full-time

30+ days ago

Job summary

An innovative tech company is on the lookout for a talented Data Engineer to join their dynamic team. This role offers the chance to architect scalable Datalakes using Snowflake and cloud solutions, while also enhancing real-time data processing and supporting business intelligence initiatives. With a hybrid work model that allows for flexibility, you will collaborate on cutting-edge data projects that drive impactful analytics and scalable storage solutions. If you're eager to tackle technical challenges and contribute to a culture of innovation, this opportunity is perfect for you.

Qualifications

  • Strong expertise in Snowflake architecture and schema design.
  • Experience in designing scalable and secure data pipelines.

Responsibilities

  • Architect scalable Datalakes using Snowflake and cloud-based storage.
  • Build and optimise ETL/ELT pipelines for data ingestion.

Skills

Snowflake
Data Engineering
ETL/ELT Pipelines
Data Governance
Containerisation
Agile/Scrum

Tools

dbt
Airflow
Kafka
Apache Spark
Grafana
Prometheus
IAM & RBAC
Redshift
BigQuery
Docker

Job description

We're working with a fast-growing tech company looking for a Data Engineer with expertise in Snowflake and Datalake architectures to help optimise and scale their data infrastructure. You’ll be part of an experienced Data Team working on high-impact projects that drive business intelligence, analytics, and scalable storage solutions.

What You’ll Do:

  • Architect scalable Datalakes using Snowflake and cloud-based storage solutions.
  • Build and optimise ETL/ELT pipelines for efficient data ingestion and transformation (see the sketch after this list).
  • Enhance real-time data processing with event-driven architectures.
  • Improve security and performance across data platforms.
  • Support business intelligence initiatives to maximise data insights.

Tech Stack:

  • Data Warehousing: Snowflake, Redshift, BigQuery
  • ETL Pipelines: dbt, Airflow, Kafka, Apache Spark
  • Monitoring & Security: Grafana, Prometheus, IAM & RBAC
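
As a purely illustrative sketch of the ELT work described above (not part of the posting itself), the following Airflow DAG loads staged event files into Snowflake and then runs dbt models. The connection ID, table and stage names, schedule, and dbt project path are all assumptions made for the example.

    # Hypothetical Airflow DAG: load raw events into Snowflake, then transform with dbt.
    # Connection IDs, schema/table names, and paths below are illustrative assumptions.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator
    from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

    with DAG(
        dag_id="events_elt_daily",            # assumed DAG name
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:

        # "Load" step: copy staged files from cloud storage into a raw Snowflake table.
        load_raw_events = SnowflakeOperator(
            task_id="load_raw_events",
            snowflake_conn_id="snowflake_default",   # assumed connection ID
            sql="""
                COPY INTO raw.events
                FROM @raw.events_stage
                FILE_FORMAT = (TYPE = 'JSON');
            """,
        )

        # "Transform" step: build analytics models with dbt on top of the raw data.
        run_dbt_models = BashOperator(
            task_id="run_dbt_models",
            bash_command="dbt run --project-dir /opt/dbt/analytics",   # assumed path
        )

        load_raw_events >> run_dbt_models

A pipeline of this shape keeps ingestion (COPY INTO) and transformation (dbt) as separate, retryable tasks, which matches the ELT pattern listed in the stack above.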

What We’re Looking For:

  • Strong expertise in Snowflake (architecture, schema design, optimisation).
  • Experience designing scalable and secure data pipelines.
  • Familiarity with containerisation (Docker), CI/CD pipelines, and Agile/Scrum.
  • A structured, detail-oriented approach to data governance and best practices.
  • Curiosity and a proactive mindset: you enjoy tackling technical challenges.

Why Join?

  • Work on cutting-edge data projects with a team of skilled engineers.
  • Enjoy a hybrid work model with flexibility (3 days WFH).
  • Be part of a company that values innovation, collaboration, and growth.

If you're looking for an exciting opportunity to shape the future of scalable data solutions, apply now!
