Senior Data Engineer (f/m/d)

TN Germany

Würzburg

On-site

EUR 60,000 - 100,000

Full-time

10 days ago

Summary

An innovative firm is seeking a skilled Data Engineer to join their dynamic team focused on developing cutting-edge solutions for managing home energy devices. In this role, you will integrate cross-cloud data platforms, build robust data pipelines, and collaborate closely with architects to shape data architecture. Your expertise in Python and cloud services will be pivotal in enhancing data delivery and model automation. This is a fantastic opportunity to contribute to sustainable energy solutions while mentoring and coaching fellow developers in a collaborative environment.

Qualifications

  • 5+ years of experience in building enterprise-grade Python data pipeline applications.
  • Strong knowledge of AWS services and Azure for Data Engineering.

Tasks

  • Integrate cross-cloud Data Platform Pipelines using AWS & Azure.
  • Build robust data pipeline applications following Clean Code and SOLID principles.

Skills

Python
Data Engineering
AWS Services
Azure
DataOps
Communication Skills

Tools

Databricks
Snowflake
GitLab
CloudFormation
Terraform
AWS Step Functions

Job description

The chapter Technology & Engineering - Data & AI Engineering brings together all experts who deliver data and AI engineering capabilities to our product teams. Organizing these experts in one chapter strengthens their functional skillsets, improves E.ON’s MLOps & AIOps tech stack, and ensures a high degree of automation in data delivery and model operations.

The products you will be working on belong to our team Digital Solutions | Future Energy Home. This team develops and operates software to manage home energy devices such as photovoltaic systems, inverters, batteries, EV charging wall boxes, heat pumps, and meters as a cloud-based solution for our central and local units, enabling rollout to end-customers. We integrate with numerous vendors’ devices and apply centralized insights, analytics, and control mechanisms for these devices.

Your tasks
  • Integrate cross-cloud Data Platform Pipelines (AWS & Azure), using Data Mesh and Data Fabric architecture concepts.
  • Implement data sharing interfaces or connectors to share business data (e.g., solar telemetry, electric vehicle charging data) with regional business data consumers.
  • Build robust data pipeline applications with AWS and Azure data services, following software principles such as Clean Code and SOLID (a minimal sketch follows this list).
  • Work closely with Data Solutions Architects to understand and shape overarching Data Architecture for data sharing interfaces and connectors.
  • Mentor and coach others, conduct pair programming sessions, review merge requests, and actively contribute to the 'Community of Practice'.
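
To make the "Clean Code and SOLID" expectation concrete, the following is a minimal, hypothetical Python sketch of a pipeline step built around the dependency-inversion principle: the transformation depends only on abstract Source/Sink interfaces, so local stubs and concrete cloud services (e.g., S3, Blob Storage, Snowflake) stay interchangeable. All names here (TelemetryRecord, InMemorySource, PrintSink, run_pipeline) are illustrative and not taken from the role description.

from abc import ABC, abstractmethod
from dataclasses import dataclass
from typing import Iterable, List


@dataclass(frozen=True)
class TelemetryRecord:
    # Hypothetical record type for the solar telemetry named in the tasks above.
    device_id: str
    watts: float


class Source(ABC):
    @abstractmethod
    def read(self) -> Iterable[TelemetryRecord]:
        """Yield raw records from any backing store."""


class Sink(ABC):
    @abstractmethod
    def write(self, records: List[TelemetryRecord]) -> None:
        """Persist cleaned records to any backing store."""


class InMemorySource(Source):
    # Stand-in for a concrete AWS/Azure source (e.g., S3 or Blob Storage).
    def __init__(self, records: List[TelemetryRecord]) -> None:
        self._records = records

    def read(self) -> Iterable[TelemetryRecord]:
        return iter(self._records)


class PrintSink(Sink):
    # Stand-in for a warehouse sink (e.g., Snowflake or Databricks).
    def write(self, records: List[TelemetryRecord]) -> None:
        for record in records:
            print(record)


def run_pipeline(source: Source, sink: Sink) -> None:
    # The transformation stays pure and unit-testable: drop impossible readings.
    cleaned = [r for r in source.read() if r.watts >= 0]
    sink.write(cleaned)


if __name__ == "__main__":
    run_pipeline(
        InMemorySource([TelemetryRecord("pv-1", 512.0), TelemetryRecord("pv-2", -1.0)]),
        PrintSink(),
    )
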
Your profile
  • At least 5 years of experience building enterprise-grade Python data pipeline (ETL) applications using software best practices such as Clean Code/SOLID principles in AWS/Azure.
  • At least 3 years of experience with relevant AWS Services for Data Engineering (e.g., Athena, Lambda, Glue, IAM, CloudWatch) and a background in Azure.
  • Profound knowledge of Databricks (e.g., PySpark), Snowflake, Python Pandas, pytest, and Behave (a short pytest example follows this list).
  • Experience building DataOps pipelines with GitLab, CloudFormation, Terraform or CDK, and orchestration tools (e.g., AWS Step Functions).
  • Experience with data modeling concepts such as Data Vault 2.0 and dimensional data modeling is a plus.
  • Excellent communication skills and the ability to mentor and coach other developers.
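
Since pytest is named explicitly, here is a short sketch of the unit-testing style the profile points to; clean_readings is a hypothetical pure transformation invented for illustration, not something from the posting. Saving this as test_clean_readings.py and running pytest would execute both tests.

# Hypothetical example: a pure transformation plus its pytest unit tests.
def clean_readings(readings: list[float]) -> list[float]:
    """Drop physically impossible (negative) power readings."""
    return [w for w in readings if w >= 0]


def test_negative_readings_are_dropped():
    assert clean_readings([512.0, -1.0, 0.0]) == [512.0, 0.0]


def test_empty_input_yields_empty_output():
    assert clean_readings([]) == []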