Senior Data Engineer (f/m/d)

Frankfurt

Remote

EUR 60,000 - 100,000

Full-time

Posted 30+ days ago

Summary

An innovative company in the field of data and AI engineering is looking for an experienced Data Engineer who will have the opportunity to work on exciting projects developing software for the management of home energy devices. In this role, you will be responsible for integrating and building robust data pipelines in a cloud environment while working with state-of-the-art technologies such as AWS and Azure. The company offers a flexible working environment that allows you to work from home or from any other location in Germany. In addition, you will receive numerous benefits, including a generous holiday policy and individual training opportunities. If you are passionate about developing innovative solutions and advancing your career, this is the perfect opportunity for you.

Benefits

Flexible working hours
30 days of holiday
Sabbatical options
Mobility support
Company pension scheme
Individual training

Qualifications

  • 5+ years of experience building Python data pipelines (ETL) with AWS/Azure.
  • Knowledge of Databricks, Snowflake and DataOps pipelines with GitLab.

Tasks

  • Integrate data platform pipelines in AWS and Azure using Data Mesh.
  • Mentor and coach developers and actively participate in the Community of Practice.

Skills

Python
Data Engineering
AWS
Azure
DataOps
Clean Code
SOLID Principles
Communication Skills

Tools

Databricks
Snowflake
GitLab
CloudFormation
Terraform
AWS Step Functions

Job Description

The chapter Technology & Engineering - Data & AI Engineering represents all experts who deliver data and AI engineering capabilities to our product teams. These experts are organised in one chapter to strengthen their functional skill sets, improve E.ON’s MLOps & AIOps tech stack and deliver a high level of data delivery and model automation.

The products you will be working on belong to our team Digital Solutions | Future Energy Home. This team develops and operates software to manage home energy devices such as photovoltaic systems, inverters, batteries, EV charging wall boxes, heat pumps and meters as a cloud-based solution for our central and local units, so they can roll out those solutions to end customers. We integrate with numerous vendors’ devices and apply centralised insights, analytics and control mechanisms for these devices.

Meaningful & Challenging - Your Tasks
  1. Integrate cross-cloud (AWS & Azure) data platform pipelines using Data Mesh and Data Fabric architecture concepts.
  2. Implement data sharing interfaces or connectors to share business data (e.g. solar telemetry, electric vehicle charging data) with our regional business data consumers.
  3. Build robust data pipeline applications with AWS and Azure data services, applying software principles such as Clean Code and SOLID (see the sketch after this list).
  4. Work closely with Data Solutions Architects to understand and shape the overarching data architecture for data sharing interfaces and connectors.
  5. Mentor and coach others, conduct pair-programming sessions, review merge requests and actively contribute to the “Community of Practice”.
Authentic & Ambitious - Your Profile
  1. At least 5 years of experience building enterprise-grade Python data pipeline (ETL) applications in AWS/Azure, using software best practices such as Clean Code and SOLID principles.
  2. At least 3 years of experience with relevant AWS services for data engineering (e.g. Athena, Lambda, Glue, AWS IAM & CloudWatch), plus a background in Azure.
  3. Profound knowledge of Databricks (e.g. PySpark), Snowflake, Python pandas, pytest and behave (a minimal test sketch follows this list).
  4. Experience building DataOps pipelines with GitLab, CloudFormation, Terraform or CDK and using orchestration tools (e.g. AWS Step Functions).
  5. Preferably, experience with data modelling concepts such as Data Vault 2.0 and dimensional data modelling.
  6. Excellent communication skills and the ability to mentor and coach other developers.
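As a small companion to point 3, a pytest sketch for the hypothetical `SolarTelemetryPipeline._clean` transform from the task section above; the module name `pipeline` is likewise an assumption.

```python
import pandas as pd

# Assumes the hypothetical pipeline sketch above was saved as pipeline.py.
from pipeline import SolarTelemetryPipeline


def test_clean_drops_implausible_readings_and_converts_units():
    raw = pd.DataFrame({"power_kw": [5.0, -1.0, 999.0]})

    cleaned = SolarTelemetryPipeline._clean(raw)

    # Only the plausible reading survives, converted from kW to W.
    assert cleaned["power_w"].tolist() == [5000.0]
    assert "power_kw" not in cleaned.columns
```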
What We Offer
  • We provide full flexibility: Do your work from home or any other place in Germany - of course including all our great offices from Hamburg to Munich. You want even more? Go on workation for up to 20 days per year within Europe.
  • Recharge your battery: You have 30 holidays per year plus Christmas and New Year's Eve on top. Your battery still needs charging? You can exchange parts of your salary for more holidays or you can take a sabbatical.
  • Your development: We grow and we want you to grow with us. Learning on the job, exchanging with others or taking part in individual training - our learning culture enables you to take your personal and professional development to the next level.
  • Let’s empower each other: Take the opportunity to engage in our Digital Empowerment Communities for collaboration, learning, and network building.
  • We elevate your mobility: From car and bike leasing offers to a subsidised Deutschland-Ticket - your way is our way.
  • Let’s think ahead: With our company pension scheme and a great insurance package we take care of your future.
  • This is by far not all: We are looking forward to speaking with you about further benefits during the recruiting process.