The chapter Technology & Engineering - Data & AI Engineering represents all experts who deliver data and AI engineering capabilities to our product teams. These experts are organized together to strengthen their functional skill sets, improve E.ON’s MLOps & AIOps tech stack, and achieve a high degree of data delivery and model automation.
The products you will be working on belong to our team Digital Solutions | Future Energy Home. This team develops and operates software to manage home energy devices such as photovoltaic systems, inverters, batteries, EV charging wall boxes, heat pumps, and meters as a cloud-based solution for our central and local units, enabling the rollout of these solutions to end-customers. We integrate with numerous vendors’ devices and apply centralized insights, analytics, and control mechanisms for these devices.
Your tasks
- Integrate cross-cloud Data Platform Pipelines (AWS & Azure), using Data Mesh and Data Fabric architecture concepts.
- Implement data sharing interfaces or connectors to share business data (e.g., solar telemetry, electric vehicle charging data) with regional business data consumers.
- Build robust data pipeline applications with AWS and Azure data services, following software principles such as Clean Code and SOLID principles.
- Work closely with Data Solutions Architects to understand and shape overarching Data Architecture for data sharing interfaces and connectors.
- Mentor and coach others, conduct pair programming sessions, review merge requests, and actively contribute to the 'Community of Practice'.
Your profile
- At least 5 years of experience in building enterprise-grade Python data pipeline (ETL) applications in AWS/Azure, applying software best practices such as Clean Code and SOLID principles.
- At least 3 years of experience with relevant AWS services for Data Engineering (e.g., Athena, Lambda, Glue, AWS IAM & CloudWatch) and a background in Azure.
- Proficient knowledge of Databricks (e.g., PySpark), Snowflake, and Python libraries such as pandas, pytest, and Behave.
- Experience building DataOps pipelines with GitLab, CloudFormation, Terraform or CDK, and orchestration tools (e.g., AWS Step Functions).
- Preferably, experience in data modeling concepts such as Data Vault 2.0 and dimensional data modeling.
- Excellent communication skills and the ability to mentor and coach other developers.
We provide full flexibility
- Work from home or any other place in Germany, including offices from Hamburg to Munich. Up to 20 days per year of workation within Europe.
- 30 holidays per year plus Christmas and New Year's Eve. Option to exchange part of your salary for more holidays or take a sabbatical.
- Opportunities for personal and professional development through on-the-job learning, training, and knowledge exchange.
- Engage in Digital Empowerment Communities for collaboration, learning, and networking.
- Mobility benefits including car and bike leasing, and a subsidized Deutschland-Ticket.
- Company pension scheme and comprehensive insurance packages to secure your future.