Senior Data Engineer - French speaking (m/f/*)

Ultra Tendency

Magdeburg

Hybrid

EUR 60,000 - 80,000

Full-time

Yesterday

Summary

A premier Data Engineering consultancy is looking for a Senior Data Engineer - French speaking to design and optimize data processing algorithms. The role involves leading the development of data workflows and collaborating with cross-functional teams. Qualified candidates should have deep expertise in Databricks and strong programming skills in Python or Scala, along with professional communication skills in German, French, and English. This position offers flexible working arrangements and a chance to work on impactful projects.

Benefits

Flexible work options including fully remote arrangements
Educational resources such as paid certifications
Annual performance reviews
Team events

Qualifications

  • Experience with Databricks on Azure, AWS, or GCP.
  • Hands-on experience with Delta Lake and MLflow.
  • Solid understanding of distributed computing and data modeling.

Responsibilities

  • Design, implement, and maintain scalable data pipelines.
  • Collaborate with data scientists and stakeholders.
  • Establish best practices for data quality and security.

Skills

Deep hands-on experience with Databricks
Strong programming skills in Python or Scala
Professional German, French & English communication skills (C1-level)

Tools

Databricks
Apache Spark
GitHub Actions
Terraform

Job Description

Senior Data Engineer - French speaking (m/f/*)

Our Engineering community is growing, and we’re now looking for a Senior Data Engineer - French speaking (m/f/*) to join our team in Germany, supporting our global growth.

As Senior Data Engineer (m/f/*), you design and optimize data processing algorithms on a talented, cross‑functional team. You are familiar with the Apache open‑source suite of technologies and want to contribute to the advancement of data engineering.

WHAT WE OFFER
  • Flexible work options, including fully remote or hybrid arrangements (candidates must be located in Germany)
  • A chance to accelerate your career and work with outstanding colleagues in a supportive learning community split across 3 continents
  • Contribute your ideas to our unique projects and make an impact by turning them into reality
  • Balance your work and personal life through our workflow organization and decide for yourself whether you work at home, in the office, or in a hybrid setup
  • Annual performance reviews and regular feedback cycles, creating value by connecting colleagues through networks rather than hierarchies
  • Educational resources such as paid certifications, unlimited access to Udemy Business, etc.
  • Local, virtual, and global team events where UT colleagues get to know one another
WHAT YOU’LL DO
  • Design, implement, and maintain scalable data pipelines using the Databricks Lakehouse Platform, with a strong focus on Apache Spark, Delta Lake, and Unity Catalog (a minimal illustrative sketch follows this list).
  • Lead the development of batch and streaming data workflows that power analytics, machine learning, and business intelligence use cases.
  • Collaborate with data scientists, architects, and business stakeholders to translate complex data requirements into robust, production‑grade solutions.
  • Optimize performance and cost‑efficiency of Databricks clusters and jobs, leveraging tools like Photon, Auto Loader, and Job Workflows.
  • Establish and enforce best practices for data quality, governance, and security within the Databricks environment.
  • Mentor junior engineers and contribute to the evolution of the team’s Databricks expertise.
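
To give a flavor of the pipeline work described above, here is a minimal, illustrative sketch in PySpark with Delta Lake. It is not part of the posting: the table name bronze.raw_events and the path /mnt/landing/events/ are hypothetical placeholders, and the snippet assumes a Databricks (or otherwise Delta-enabled Spark) runtime.

from pyspark.sql import SparkSession, functions as F

# Obtain the active Spark session (already provided on Databricks clusters).
spark = SparkSession.builder.appName("events-bronze-ingest").getOrCreate()

# Read raw JSON files landed in cloud storage (path is a placeholder).
raw = spark.read.json("/mnt/landing/events/")

# Light cleansing: drop rows without an event_id and stamp the ingestion time.
cleaned = (
    raw.dropna(subset=["event_id"])
       .withColumn("ingested_at", F.current_timestamp())
)

# Append to a Delta table so downstream consumers get ACID guarantees and time travel.
(cleaned.write
        .format("delta")
        .mode("append")
        .saveAsTable("bronze.raw_events"))
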
WHAT YOU’LL BRING
  • Deep hands‑on experience with Databricks on Azure, AWS, or GCP, including Spark (PySpark/Scala), Delta Lake, and MLflow.
  • Strong programming skills in Python or Scala, and experience with CI/CD pipelines (e.g., GitHub Actions, Azure DevOps).
  • Solid understanding of distributed computing, data modeling, and performance tuning in cloud‑native environments.
  • Familiarity with orchestration tools (e.g., Databricks Workflows, Airflow) and infrastructure‑as‑code (e.g., Terraform).
  • A proactive mindset, strong communication skills, and a passion for building scalable, reliable data systems.
  • Professional German, French & English communication skills (C1‑level, written and spoken).

Did we pique your interest, or do you have any questions?

Ultra Tendency is a premier international Data Engineering consultancy for Big Data, Cloud, Streaming, IIoT, and Microservices. We design, build, and operate large‑scale data‑driven applications for major enterprises such as the European Central Bank, HUK‑Coburg, Deutsche Telekom, and Europe’s largest car manufacturer. Founded in Germany in 2010, UT has developed a reliable client base and now runs 8 branches in 7 countries across 3 continents.

We do more than just leverage tech; we build it. At Ultra Tendency we contribute source code to more than 20 open‑source projects, including Ansible, Terraform, NiFi, and Kafka. Our impact on tech and business is there for anyone to see. Enterprises seek out Ultra Tendency because we solve the problems others cannot.

We love the challenge: together, we tackle diverse and unique projects you will find nowhere else. In our knowledge community, you will be part of a supportive network, not a hierarchy. Constant learning and feedback are the drivers of our steady development. With us, you can shape your individual career while keeping a healthy work‑life balance.

We evaluate your application based on your skills and corresponding business requirements. Ultra Tendency welcomes applications from qualified candidates regardless of race, ethnicity, national or social origin, disability, sex, sexual orientation, or age.
