Client:
E.ON
Location:
-
Job Category:
Other
EU work permit required:
Yes
Job Reference:
b9a0a3e9ce8f
Posted:
11.05.2025
Expiry Date:
25.06.2025
Job Description:
About the team
In this position, you will be part of our Chapter Platform Engineering. The chapter brings together all experts who deliver DevOps capabilities to our product teams. Organizing these experts together strengthens their functional skill sets, improves E.ON’s DevOps tech stack and delivers a high level of cloud automation.
The products you will be working on belong to our team Digital Solutions | Future Energy Home. This team develops and operates cloud-based software to manage home energy devices such as photovoltaic systems, inverters, batteries, EV charging wall boxes, heat pumps and meters, so that our central and local units can roll out these solutions to end customers. We integrate with numerous vendors’ devices and apply centralized insights, analytics and control mechanisms to these devices.
Meaningful & challenging - Your tasks
- Implementing CI/CD and DevOps practices for the Future Energy Home Data Team, focusing on ETL pipelines.
- Supporting the automation and maintenance of BI tools such as Power BI and Google Looker Studio for data quality metrics by implementing automated applications.
- Supporting data engineers in building and operationalizing data self-serve infrastructure across multiple cloud platforms (e.g., AWS, GCP, Azure).
- Ensuring data security of the deployed data products by implementing data access and anonymization methods (e.g., data masking, pseudonymization) in compliance with the DPO’s recommendations.
- Operationalizing Analytical Products (e.g., BI dashboards and Data Science ML models) and implementing data quality metrics.
Authentic & ambitious - Your profile
- Several years of experience building enterprise-grade GitLab CI/CD pipelines and Data Version Control (DVC) for Python data pipeline (ETL) applications.
- Several years of experience building BI dashboards and monitoring agents for Power BI and Google Looker Studio (or similar tools).
- Profound experience with GCP BigQuery, Databricks (e.g., PySpark), Snowflake, Python Dask & Pandas, and Python testing with Pytest and Behave.
- First experience implementing and using Attribute-Based Access Control (ABAC) tools such as Immuta or Segmentor, as well as RBAC tools (e.g., Okta, AWS IAM, AWS Cognito), to democratize data access.
- Proven knowledge of AWS Cloud services like AWS SageMaker, CloudFormation, CDK, AWS Step Functions and AWS CloudWatch, and deployment strategies like blue-green/canary.
- Strong communication skills and the ability to help others and contribute to a “Community of Practice”.
Your benefits
- We provide full flexibility: Do your work from home or any other place in Germany - including our offices from Hamburg to Munich. Want even more? Go on a workation for up to 20 days per year within Europe.
- Recharge your battery: You have 30 holidays per year plus Christmas and New Year's Eve. You can exchange parts of your salary for more holidays or take a sabbatical.
- Your development: We grow and want you to grow with us. Learning on the job, exchanging with others and taking part in training all support your personal and professional growth.
- Let’s empower each other: Engage in our Digital Empowerment Communities for collaboration, learning, and networking.
- We elevate your mobility: From car and bike leasing offers to a subsidized Deutschland-Ticket.
- Let’s think ahead: Our company pension scheme and insurance packages support your future.
- And that’s far from all: We look forward to discussing further benefits during the recruitment process.