
Data Pipeline Engineer

Uni Systems

Strasbourg

On-site

EUR 55 000 - 75 000

Full-time

Posted 3 days ago

Job Summary

A technology company in France is seeking a skilled Data Pipeline Engineer to design and maintain data connectors and implement data pipelines. The ideal candidate will have a Master's degree and at least 8 years of professional IT experience, including hands-on work with data integration tools such as Airbyte. You will be responsible for ensuring reliable data flows, managing batch and streaming pipelines, and running workloads on Kubernetes. The position requires solid SQL skills and familiarity with cloud environments.

Qualifications

  • At least 8 years of professional experience in IT.
  • Strong experience with data pipelines and relevant ETL/ELT frameworks.
  • Familiarity with cloud environments and data warehouses.

Responsibilities

  • Design and maintain data connectors between systems and the data warehouse.
  • Build and manage batch and streaming data pipelines.
  • Implement monitoring and alerting for data pipelines.

Skills

  • Data pipelines
  • ETL/ELT architectures
  • SQL
  • Kubernetes
  • Python
  • Open-source tools (Airbyte, Kafka Connect)

Education

Master's degree in IT or a related field

Tools

  • Airbyte
  • Kafka Connect
  • dbt

Job Description

At Uni Systems, we work towards turning digital visions into reality. We are continuously growing, and we are looking for a Data Pipeline Engineer to join our UniQue team.

What will you be doing in this role?

  • Design, implement, and maintain data connectors between source systems and the data warehouse.
  • Configure and operate open-source data integration tools (e.g. Airbyte or similar).
  • Ensure reliable, observable, and fault-tolerant data pipelines.
  • Build and manage batch and/or streaming data pipelines.
  • Handle data extraction, loading, and basic transformation in alignment with warehouse models defined by the Data Architect.
  • Implement monitoring, logging, and alerting for pipelines (a minimal sketch follows this list).
  • Define and implement data access patterns to the data warehouse.
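
To give a concrete flavor of the monitoring and alerting work named above, here is a minimal, illustrative sketch of a batch extract-load step with logging and a simple alert hook. It is a sketch under stated assumptions, not Uni Systems' actual stack: the table names, the alert hook, and sqlite3 standing in for the source and the warehouse are all hypothetical.

```python
"""Minimal batch extract-load sketch with logging and an alert hook.
Everything here (table names, alert hook, sqlite3 stand-ins) is
hypothetical and only illustrates the shape of the work."""
import logging
import sqlite3
from datetime import datetime, timezone

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline.orders_load")


def send_alert(message: str) -> None:
    # Hypothetical alert hook; a real pipeline might page on-call or post to chat.
    log.error("ALERT: %s", message)


def extract(source: sqlite3.Connection) -> list[tuple]:
    # Pull rows from the hypothetical source table.
    return source.execute("SELECT id, amount FROM orders").fetchall()


def load(warehouse: sqlite3.Connection, rows: list[tuple]) -> None:
    # Load into a staging table; transformation would happen downstream (e.g. dbt).
    warehouse.executemany("INSERT INTO stg_orders (id, amount) VALUES (?, ?)", rows)
    warehouse.commit()


def run() -> None:
    started = datetime.now(timezone.utc)
    source = sqlite3.connect(":memory:")
    warehouse = sqlite3.connect(":memory:")
    # Seed the stand-in source and create the staging table.
    source.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
    source.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 9.5), (2, 20.0)])
    warehouse.execute("CREATE TABLE stg_orders (id INTEGER, amount REAL)")
    try:
        rows = extract(source)
        load(warehouse, rows)
        log.info("loaded %d rows in %s", len(rows), datetime.now(timezone.utc) - started)
    except Exception as exc:
        # Monitoring: surface failures loudly instead of swallowing them.
        send_alert(f"orders load failed: {exc}")
        raise


if __name__ == "__main__":
    run()
```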

What will you be bringing to the team?

  • Master's degree in IT or a related field and at least 8 years of professional experience in IT.
  • Strong experience with data pipelines and ETL/ELT architectures, ideally with OpenShift-related frameworks.
  • Experience with running data pipelines with Kubernetes.
  • Hands-on experience with open-source data integration tools (e.g. Airbyte, Kafka Connect, Singer, etc.); a short streaming sketch follows this list.
  • Solid knowledge of SQL and relational databases (PostgreSQL or similar).
  • Experience working with data warehouses (cloud or on‑prem).
  • Familiarity with Python for pipeline development and automation.
  • Understanding of data access control, authentication, and authorization.
  • Familiarity with dbt or similar transformation tools.
  • Experience in cloud environments.
  • Fluent in English, at least at B2 level.
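
As a flavor of the streaming work mentioned in the list above, the sketch below shows a minimal consume loop using the kafka-python client. The topic name, consumer group, and broker address are assumptions for illustration; in practice a tool like Kafka Connect would manage this declaratively rather than through hand-written loops.

```python
"""Minimal streaming consume loop sketch (kafka-python client).
Topic, group id, and broker address are hypothetical."""
import json

from kafka import KafkaConsumer  # pip install kafka-python

consumer = KafkaConsumer(
    "orders",                                  # hypothetical topic
    bootstrap_servers="localhost:9092",        # hypothetical broker
    group_id="warehouse-loader",               # hypothetical consumer group
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    auto_offset_reset="earliest",
)

for message in consumer:
    # A real pipeline would batch records and load them into the warehouse;
    # printing keeps the sketch self-contained.
    print(f"offset={message.offset} record={message.value}")
```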

At Uni Systems, we provide equal employment opportunities and prohibit any form of discrimination on the grounds of gender, religion, race, color, nationality, disability, social class, political beliefs, age, marital status, sexual orientation, or any other characteristic. Take a look at our Diversity, Equality & Inclusion Policy for more information.
