In this position, you will be part of our Platform Engineering chapter. The chapter brings together all experts who deliver DevOps capabilities to our product teams. Organizing these experts together strengthens their functional skillsets, improves E.ON's DevOps tech stack, and ensures a high level of cloud automation.
The products you will be working on belong to our Digital Solutions | Future Energy Home team. This team develops and operates cloud-based software to manage home energy devices such as photovoltaic systems, inverters, batteries, EV charging wall boxes, heat pumps, and meters for our central and local units, enabling them to roll out these solutions to end customers. We integrate devices from numerous vendors and apply centralized insights, analytics, and control mechanisms to them.
Meaningful & Challenging - Your Tasks
- Implement CI/CD and DevOps practices for the Future Energy Home Data Team, focusing on ETL pipelines.
- Support the automation and maintenance of BI tools such as Power BI and Google Looker Studio for data quality metrics by implementing automated applications.
- Assist data engineers in building and operationalizing self-service data infrastructure across multiple cloud platforms (e.g., AWS, GCP, Azure).
- Ensure the data security of deployed data products by implementing data access and anonymization methods (e.g., data masking, pseudonymization) in compliance with the DPO's recommendations.
- Operationalize analytical products (e.g., BI dashboards and Data Science ML models) and implement data quality metrics.
- Contribute to the 'Community of Practice' to foster collaboration and exchange.
Authentic & Ambitious - Your Profile
- Several years of experience building enterprise-grade GitLab CI/CD pipelines and Data Version Control for Python data pipeline (ETL) applications.
- Extensive experience building BI dashboards and monitoring agents for Power BI and Google Looker Studio (or similar tools).
- Profound experience with GCP BigQuery, Databricks (e.g., PySpark), Snowflake, and Python tooling such as Dask, pandas, pytest, and Behave.
- Initial experience implementing and using Attribute-Based Access Control (ABAC) tools such as Immuta or Segmentor, as well as RBAC tools (e.g., Okta, AWS IAM, AWS Cognito), to democratize data access.
- Proven knowledge of AWS cloud services such as SageMaker, CloudFormation, CDK, Step Functions, and CloudWatch, as well as deployment strategies such as blue-green and canary.
- Good communication skills and the ability to help others and contribute to the 'Community of Practice.'
We offer:
- Full flexibility: Work from home or from any location in Germany, including our offices from Hamburg to Munich, plus up to 20 days of workation per year within Europe.
- Recharge your batteries: 30 days of holiday plus Christmas and New Year's Eve, with the option to exchange parts of your salary for additional holidays or to take a sabbatical.
- Your development: Opportunities for on-the-job learning, training, and growth.
- Empowerment: Engage in our Digital Empowerment Communities for collaboration and learning.
- Mobility: Car and bike leasing, subsidized Deutschland-Ticket.
- Future planning: Company pension scheme and insurance packages.
- Further benefits: To be discussed during the recruitment process.