Senior Databricks Data Engineer (m/f/d)

Ultra Tendency

Magdeburg

Hybrid

EUR 80,000 - 100,000

Full-time

14 days ago

Summary

A leading Data Engineering consultancy in Germany is seeking a Senior Databricks Data Engineer to design and implement scalable data pipelines. You will optimize performance, mentor junior engineers, and collaborate with diverse teams. Ideal candidates have hands-on experience with Databricks and strong programming skills in Python or Scala. This role offers flexibility in work location and a chance to impact significant projects.

Benefits

Professional development opportunities
Individual development plan
Flexible work location
Paid certifications access

Qualifications

  • Expertise in Databricks on Azure, AWS, or GCP.
  • Solid understanding of distributed computing and performance tuning.
  • Professional communication skills in German & English.

Responsibilities

  • Design and maintain scalable data pipelines.
  • Lead batch and streaming data workflow development.
  • Collaborate to translate data requirements into solutions.
  • Optimize Databricks cluster performance.
  • Mentor junior engineers.

Skills

Deep hands-on experience with Databricks
Strong programming skills in Python or Scala
Familiarity with orchestration tools
Proactive mindset and strong communication skills

Tools

Databricks
Azure
GCP
Apache Spark

Job Description

Overview

Our Engineering community is growing, and we're now looking for a Senior Databricks Data Engineer to join our team in Germany and support our global growth.

As a Senior Databricks Data Engineer, you will design and optimize data processing algorithms as part of a talented cross-functional team. You are familiar with the Apache open-source suite of technologies and want to contribute to the advancement of data engineering.

What we offer
  • A chance to accelerate your career and work with outstanding colleagues in a supportive learning community spread across three continents
  • Contribute your ideas to our unique projects and make an impact by turning them into reality
  • Balance your work and personal life through our workflow organization and decide for yourself whether you work at home, in the office, or in a hybrid setup
  • Annual performance reviews and regular feedback cycles that generate distinct value by connecting colleagues through networks rather than hierarchies
  • An individual development plan and professional development opportunities
  • Educational resources such as paid certifications, unlimited access to Udemy Business, etc.
  • Local, virtual, and global team events in which UT colleagues get to know one another
What you'll do
  • Design, implement, and maintain scalable data pipelines using the Databricks Lakehouse Platform, with a strong focus on Apache Spark, Delta Lake, and Unity Catalog.
  • Lead the development of batch and streaming data workflows that power analytics, machine learning, and business intelligence use cases.
  • Collaborate with data scientists, architects, and business stakeholders to translate complex data requirements into robust, production-grade solutions.
  • Optimize the performance and cost-efficiency of Databricks clusters and jobs, leveraging tools like Photon, Auto Loader, and Job Workflows.
  • Establish and enforce best practices for data quality, governance, and security within the Databricks environment.
  • Mentor junior engineers and contribute to the evolution of the team's Databricks expertise.
What you'll bring
  • Deep hands-on experience with Databricks on Azure, AWS, or GCP, including Spark (PySpark/Scala), Delta Lake, and MLflow.
  • Strong programming skills in Python or Scala, and experience with CI/CD pipelines (e.g., GitHub Actions, Azure DevOps).
  • Solid understanding of distributed computing, data modeling, and performance tuning in cloud-native environments.
  • Familiarity with orchestration tools (e.g., Databricks Workflows, Airflow) and infrastructure-as-code (e.g., Terraform).
  • A proactive mindset, strong communication skills, and a passion for building scalable, reliable data systems.
  • Professional communication skills in German and English.

Did we pique your interest, or do you have any questions?

We want to hear from you: contact us at

About Us

Ultra Tendency is an international premier Data Engineering consultancy for Big Data, Cloud, Streaming, IIoT, and Microservices. We design, build, and operate large-scale data-driven applications for major enterprises such as the European Central Bank, HUK-Coburg, Deutsche Telekom, and Europe's largest car manufacturer. Founded in Germany in 2010, UT has developed a reliable client base and now runs 8 branches in 7 countries across 3 continents.

We do more than just leverage tech: we build it. At Ultra Tendency, we contribute source code to 20 open-source projects, including Ansible, Terraform, NiFi, and Kafka. Our impact on tech and business is there for anyone to see. Enterprises seek out Ultra Tendency because we solve the problems others cannot.

We love the challenge: together we tackle diverse and unique projects you will find nowhere else. In our knowledge community, you will be part of a supportive network, not a hierarchy. Constant learning and feedback are our drivers for stable development. With us, you can develop your individual career while keeping a healthy work-life balance.

We evaluate your application based on your skills and the corresponding business requirements. Ultra Tendency welcomes applications from qualified candidates regardless of race, ethnicity, national or social origin, disability, sex, sexual orientation, or age.

Data privacy statement: Data Protection for Applicants (Ultra Tendency)

Details
  • Employment Type: Full Time
  • Experience: years
  • Vacancy: 1