Senior Data Engineer

Collective

Paris

On-site

EUR 60,000 - 80,000

Full-time

23 days ago

Job summary

A leading tech company in Paris seeks a Senior Data Engineer to join its data engineering team. The role emphasizes the implementation of Data Lakehouse architecture and best practices in building data pipelines. Candidates must possess strong Python and Airflow skills. This position offers the chance to help transform the company's data platform through innovative engineering solutions.

Qualifications

  • Hands-on experience in implementing Data Lakehouse architecture.
  • Advanced Python skills are essential.
  • Experience with Airflow is required.

Responsibilities

  • Provide high-quality code aligned with software engineering best practices.
  • Create data collection, ingestion, and validation pipelines.
  • Sync with DataOps team for CI/CD pipeline enhancements.

Knowledge

Data Lakehouse architecture
Python
Airflow
SQL
AWS services
Data Mesh principles
Data Governance
Data Quality principles

Tools

Apache Spark
Terraform
Snowflake
Apache Iceberg
Delta Lake

Job description

About the role:

We are looking for a results-driven Senior Data Engineer to join our core data engineering team. The successful candidate should have in-depth knowledge of data engineering and the ability to share experience with the team. The company has recently started a Data Transformation Program involving broad-scale modernization of Data Architecture, Data Governance, and Data Exposition. The ultimate goal for this role is to apply best engineering practices in building end-to-end data pipelines and to support the transformation of the core data platform according to the Data Lakehouse approach.

Responsibilities:
  • Provide high-quality code aligned with software engineering best practices
  • Share experience with the team; encourage creating guidelines and policies
  • Create data collection, ingestion, and validation pipelines for various sources
  • Sync with the DataOps team to leverage the required CI/CD pipelines and integrations
  • Work with the Data Architect to validate the target data architecture design
  • Lead POCs
  • Help the team refactor the existing orchestration design in Airflow
Required skills & experience:
  • Hands-on experience implementing Data Lakehouse architecture
  • Advanced hands-on Python skills are a must
  • Airflow is a must
  • Spark - nice to have
  • Experience with Apache Iceberg / Delta Lake - nice to have
  • Hands-on experience with AWS services
  • Advanced SQL
  • Deep understanding of Data modelling; knowledge of medallion architecture is a plus
  • Terraform - nice to have
  • Good understanding of Data Mesh principles
  • Snowflake experience is nice to have
  • Good understanding of Data Governance & Data Quality principles