
Senior Data Engineer – French speaking (m/f/*)

Ultra Tendency

Magdeburg

Hybrid

EUR 65,000–85,000

Full-time

Yesterday
Be among the first applicants

Summary

An international data engineering consultancy is looking for a Senior Data Engineer in Magdeburg, Germany. You will design and optimize data processing algorithms, lead the development of data workflows, and collaborate with various stakeholders. Deep hands-on experience with Databricks and strong programming skills in Python or Scala are essential. The role offers flexible work arrangements and numerous professional development opportunities. Candidates should be fluent in German, French, and English.

Benefits

Flexible work options
Professional development opportunities
Educational resources such as paid certifications

Qualifications

  • Experience with batch and streaming data workflows.
  • Familiarity with orchestration tools like Airflow.
  • Strong skills in performance tuning in cloud-native environments.

Responsibilities

  • Design and maintain scalable data pipelines using Databricks.
  • Optimize Databricks clusters for performance and cost efficiency.
  • Lead development of data workflows for machine learning.

Skills

Deep hands-on experience with Databricks
Strong programming skills in Python or Scala
Professional German, French & English communication skills (C1)
Solid understanding of distributed computing

Tools

Databricks
Apache Spark
Delta Lake
GitHub Actions
Terraform

Job Description

Our Engineering community is growing and we are now looking for a Senior Data Engineer – French speaking (m/f/*) to join our team in Germany, supporting our global growth.

As a Senior Data Engineer (m/f/*), you will design and optimize data processing algorithms on a talented cross‑functional team. You are familiar with the Apache open‑source suite of technologies and want to contribute to the advancement of data engineering.

WHAT WE OFFER
  • Flexible work options including fully remote or hybrid arrangements (candidates must be located in Germany)
  • A chance to accelerate your career and work with outstanding colleagues in a supportive learning community split across 3 continents
  • Contribute your ideas to our unique projects and make an impact by turning them into reality
  • Balance your work and personal life through our flexible workflow organization: decide for yourself whether you work at home, in the office, or in a hybrid setup
  • Annual performance reviews and regular feedback cycles that connect colleagues through networks rather than hierarchies
  • Individual development plan and professional development opportunities
  • Educational resources such as paid certifications, unlimited access to Udemy Business, etc.
  • Local, virtual, and global team events where UT colleagues get to know one another

WHAT YOU'LL DO
  • Design, implement and maintain scalable data pipelines using the Databricks Lakehouse Platform, with a strong focus on Apache Spark, Delta Lake and Unity Catalog (a minimal sketch follows this list).
  • Lead the development of batch and streaming data workflows that power analytics, machine learning and business intelligence use cases.
  • Collaborate with data scientists, architects and business stakeholders to translate complex data requirements into robust production‑grade solutions.
  • Optimize performance and cost‑efficiency of Databricks clusters and jobs, leveraging tools like Photon, Auto Loader and Job Workflows.
  • Establish and enforce best practices for data quality, governance and security within the Databricks environment.
  • Mentor junior engineers and contribute to the evolution of the team’s Databricks expertise.
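
For illustration only, here is a minimal PySpark sketch of the kind of pipeline described above: Auto Loader incrementally ingests raw files and writes them to a Delta table registered in Unity Catalog. The catalog/schema/table names and paths are hypothetical placeholders, and the cloudFiles source assumes a Databricks runtime:

  # Minimal sketch: incremental ingestion with Auto Loader into a
  # Unity Catalog Delta table. All names and paths are hypothetical.
  from pyspark.sql import SparkSession

  spark = SparkSession.builder.getOrCreate()

  events = (
      spark.readStream.format("cloudFiles")            # Auto Loader source
      .option("cloudFiles.format", "json")             # format of incoming files
      .option("cloudFiles.schemaLocation", "/tmp/schemas/events")
      .load("s3://example-bucket/raw/events/")         # hypothetical landing zone
  )

  (
      events.writeStream.format("delta")
      .option("checkpointLocation", "/tmp/checkpoints/events")
      .trigger(availableNow=True)                      # process all pending files, then stop
      .toTable("main.analytics.raw_events")            # three-level Unity Catalog name
  )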

WHAT YOU'LL BRING
  • Deep hands‑on experience with Databricks on Azure, AWS or GCP, including Spark (PySpark/Scala), Delta Lake and MLflow.
  • Strong programming skills in Python or Scala and experience with CI/CD pipelines (e.g. GitHub Actions, Azure DevOps).
  • Solid understanding of distributed computing, data modeling and performance tuning in cloud‑native environments.
  • Familiarity with orchestration tools (e.g. Databricks Workflows, Airflow) and infrastructure‑as‑code (e.g. Terraform); see the orchestration sketch after this list.
  • A proactive mindset, strong communication skills and a passion for building scalable, reliable data systems.
  • Professional German, French & English communication skills (C1‑level written and spoken).
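
On the orchestration side, a minimal Airflow sketch that triggers an existing Databricks job nightly. The DAG id, connection id and job id are hypothetical; the operator ships with the apache-airflow-providers-databricks package, and the schedule argument assumes Airflow 2.4+:

  # Minimal sketch: nightly trigger of a pre-existing Databricks job.
  # The job_id and connection id below are hypothetical placeholders.
  from datetime import datetime

  from airflow import DAG
  from airflow.providers.databricks.operators.databricks import DatabricksRunNowOperator

  with DAG(
      dag_id="nightly_events_pipeline",
      start_date=datetime(2024, 1, 1),
      schedule="0 2 * * *",      # every day at 02:00
      catchup=False,
  ) as dag:
      DatabricksRunNowOperator(
          task_id="run_databricks_job",
          databricks_conn_id="databricks_default",  # assumed Airflow connection
          job_id=12345,                             # hypothetical Databricks job id
      )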

Did we pique your interest or do you have any questions?

We want to hear from you: contact us at

ABOUT US

Ultra Tendency is an international premier Data Engineering consultancy for Big Data, Cloud, Streaming, IoT and Microservices. We design, build and operate large‑scale data‑driven applications for major enterprises such as the European Central Bank, HUK‑Coburg, Deutsche Telekom and Europe’s largest car manufacturer. Founded in Germany in 2010, UT has developed a reliable client base and now runs 8 branches in 7 countries across 3 continents.

We do more than just leverage tech; we build it. At Ultra Tendency we contribute source code to 20 open‑source projects including Ansible, Terraform, NiFi and Kafka. Our impact on tech and business is there for anyone to see. Enterprises seek out Ultra Tendency because we solve the problems others cannot.

We evaluate your application based on your skills and corresponding business requirements. Ultra Tendency welcomes applications from qualified candidates regardless of race, ethnicity, national or social origin, disability, sex, sexual orientation or age.

Data privacy statement: Data Protection for Applicants Ultra Tendency

Senior IC

Key Skills: Apache Hive, S3, Hadoop, Redshift, Spark, AWS, Apache Pig, NoSQL, Big Data, Data Warehouse, Kafka, Scala

Employment Type: Full Time
Experience: years
Vacancy: 1
