Location:
Job Category: Other
EU work permit required: Yes
Job Reference: 82a23c76542b
Posted: 06.05.2025
Expiry Date: 20.06.2025
Job Description:
About the team
In this position, you will be part of our Platform Engineering chapter. The chapter brings together all the experts who deliver DevOps capabilities to our product teams; organizing them in one chapter strengthens their functional skill sets, improves E.ON’s DevOps tech stack, and delivers a high level of cloud automation.
The products you will be working on belong to our team Digital Solutions | Future Energy Home. This team develops and operates cloud-based software to manage home energy devices such as photovoltaic systems, inverters, batteries, EV charging wall boxes, heat pumps, and meters for our central and local units, enabling rollout to end customers. We integrate devices from numerous vendors and apply centralized insights, analytics, and control mechanisms.
Meaningful & challenging - Your tasks
- Implement CI/CD and DevOps practices for the Future Energy Home Data Team, focusing on ETL pipelines.
- Support the automation and maintenance of BI tools such as Power BI and Google Looker Studio, serving data quality metrics via automated applications.
- Support data engineers in building and operationalizing self-serve data infrastructure across multiple cloud platforms (e.g., AWS, GCP, Azure).
- Ensure the data security of deployed data products by implementing data access and anonymization methods (e.g., data masking, pseudonymization) in compliance with the DPO’s recommendations (see the sketch after this list).
- Operationalize analytical products (e.g., BI dashboards and data science ML models) and implement data quality metrics.
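To make the masking and pseudonymization task above concrete, here is a minimal, illustrative Python sketch of how such methods might look inside an ETL step. It assumes pandas; the column names, salt handling, and digest truncation are hypothetical choices for illustration, not E.ON’s actual implementation.

```python
import hashlib

import pandas as pd

# Hypothetical example records; the column names are illustrative only.
df = pd.DataFrame({
    "customer_id": ["c-100", "c-101"],
    "email": ["anna@example.com", "ben@example.com"],
    "meter_reading_kwh": [412.5, 388.0],
})

SALT = "rotate-me-regularly"  # assumption: in practice, loaded from a secret store

def pseudonymize(value: str) -> str:
    # Replace an identifier with a salted SHA-256 digest so records stay
    # joinable across tables without exposing the original ID.
    return hashlib.sha256((SALT + value).encode()).hexdigest()[:16]

def mask_email(email: str) -> str:
    # Keep the domain for aggregate analytics, mask the local part.
    local, _, domain = email.partition("@")
    return f"{local[0]}***@{domain}"

df["customer_id"] = df["customer_id"].map(pseudonymize)
df["email"] = df["email"].map(mask_email)
print(df)
```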
Authentic & ambitious - Your profile
- Several years of experience building enterprise-grade GitLab CI/CD pipelines and using Data Version Control (DVC) for Python data pipeline (ETL) applications.
- Several years of experience building BI dashboards and monitoring agents with Power BI and Google Looker Studio (or similar tools).
- Profound experience with GCP BigQuery, Databricks (e.g., PySpark), Snowflake, Python Dask and pandas, pytest, and behave (see the testing sketch after this list).
- Initial experience implementing and using Attribute-Based Access Control (ABAC) tools such as Immuta or Segmentor, and RBAC tools (e.g., Okta, AWS IAM, AWS Cognito), to democratize data access.
- Proven knowledge of AWS cloud services such as SageMaker, CloudFormation, CDK, Step Functions, and CloudWatch, and of deployment strategies such as blue-green and canary releases.
- Good communication skills and a willingness to help others, contributing to a “Community of Practice”.
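Since the profile asks for pytest experience around data pipelines, the following is a minimal sketch of pytest-style data quality checks over a pandas frame. The schema and plausibility bounds are invented for illustration; in a real pipeline the fixture would load from BigQuery, Snowflake, or a Databricks table rather than from literals.

```python
import pandas as pd
import pytest

@pytest.fixture
def readings() -> pd.DataFrame:
    # Stand-in for an ETL extract; columns and values are hypothetical.
    return pd.DataFrame({
        "device_id": ["pv-1", "pv-2", "hp-1"],
        "energy_kwh": [3.2, 0.0, 1.7],
    })

def test_no_missing_device_ids(readings):
    assert readings["device_id"].notna().all()

def test_energy_within_plausible_range(readings):
    # Hypothetical bounds for a single interval reading.
    assert readings["energy_kwh"].between(0, 100).all()

def test_device_ids_unique(readings):
    assert readings["device_id"].is_unique
```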
Our benefits
- We provide full flexibility: Work from home or any other place in Germany, including our offices from Hamburg to Munich. Want more? Go on a workation for up to 20 days per year within Europe.
- Recharge your battery: 30 holidays per year plus Christmas and New Year’s Eve. You can exchange parts of your salary for more holidays or take a sabbatical.
- Your development: We grow and want you to grow with us. Learning on the job, exchanging with others, or taking part in individual training—our culture supports your growth.
- Let’s empower each other: Engage in our Digital Empowerment Communities for collaboration, learning, and networking.
- We elevate your mobility: Car and bike leasing offers, subsidized Deutschland-Ticket—your way is our way.
- Let’s think ahead: Company pension scheme and insurance packages to secure your future.
- This is by far not all: Further benefits will be discussed during the recruiting process.