Senior Data Engineer (f/m/d)

Be among the first applicants.
Frankfurt
Remote
EUR 70,000 - 90,000
7 days ago
Job Description

Senior Data Engineer (f/m/d), Frankfurt am Main

Client:

The chapter Technology & Engineering - Data & AI Engineering

Location:

Frankfurt am Main

Job Category:

-

EU work permit required:

Yes

Job Reference:

2a99fa7765b8

Posted:

16.05.2025

Expiry Date:

30.06.2025

Job Description:

About the team

The chapter Technology & Engineering - Data & AI Engineering represents all experts who deliver data and AI engineering capabilities to our product teams. These experts are organized together to strengthen their functional skill sets, improve E.ON's MLOps & AIOps tech stack, and deliver a high degree of automation in data delivery and model operations.

The products you will be working on belong to our team Digital Solutions | Future Energy Home. This team develops and operates cloud-based software to manage home energy devices such as photovoltaic systems, inverters, batteries, EV charging wall boxes, heat pumps, and meters for our central and local units, so that they can roll out these solutions to end customers. We integrate with numerous vendors' devices and apply centralized insights, analytics, and control mechanisms for these devices.

Meaningful & challenging - Your tasks

  • Integrate cross-cloud Data Platforms and Pipelines, using Data Mesh and Data Fabric architecture concepts.
  • Implement data sharing interfaces or connectors to share business data (e.g., solar telemetry, electric vehicle charging data) with regional business data consumers.
  • Build robust data pipeline applications with AWS and Azure data services, adhering to software principles such as Clean Code and SOLID principles.
  • Work closely with Data Solutions Architects to understand and shape overarching Data Architecture for data sharing interfaces and connectors.
  • Mentor and coach others, conduct pair programming sessions, review merge requests, and actively contribute to the "Community of Practice".

Authentic & ambitious - Your profile

  • At least 5 years of experience in building enterprise-grade Python data pipeline (ETL) applications using software best practices such as Clean Code/SOLID principles in AWS/Azure.
  • At least 3 years of experience in relevant AWS Services for Data Engineering (e.g., Athena, Lambda, Glue, AWS IAM & CloudWatch) and a background in Azure.
  • Profound knowledge in Databricks (e.g., PySpark), Snowflake, Python Pandas, Python-Pytest, and Python Behave.
  • Experience building DataOps pipelines with GitLab, CloudFormation, Terraform, or CDK, and using orchestration tools (e.g., AWS Step Functions).
  • Experience in Data Modeling Concepts such as DataVault 2.0 and Dimensional Data Modeling is a plus.
  • Excellent communication skills and the ability to mentor and coach other developers.

We provide full flexibility: Work from home or any other place in Germany, including our offices from Hamburg to Munich. Up to 20 days of workation per year within Europe are also possible.

Recharge your battery: 30 holidays per year plus Christmas and New Year's Eve. Additional options include exchanging parts of your salary for more holidays or taking a sabbatical.

Your development: We support your growth through on-the-job learning, exchanges, and individual training to enhance your personal and professional development.

Let’s empower each other: Engage in our Digital Empowerment Communities for collaboration, learning, and networking.

We elevate your mobility: Car and bike leasing offers, subsidized Deutschland-Ticket, etc.

Let’s think ahead: Company pension scheme and comprehensive insurance packages for your future.

This is by far not all: Further benefits will be discussed during the recruitment process.