DevOps Engineer - Data (f/m/d)

Operationalize Analytical Products (i.e. B.

Würzburg

On-site

EUR 60,000 - 90,000

Full-time

27 days ago

Summary

A leading company in the energy sector is seeking a DevOps expert to implement CI/CD pipelines and enhance data infrastructure. You will work on cloud-based solutions managing home energy devices, ensuring data security and quality, while collaborating with data engineers and contributing to a community of practice.

Qualifications

  • Several years of experience in building enterprise-grade GitLab CI/CD pipelines.
  • Experience in building BI dashboards for PowerBI and Google Looker Studio.
  • Proven knowledge in AWS Cloud services and data security measures.

Tasks

  • Implement CI/CD pipelines and DevOps practices focusing on ETL pipelines.
  • Support automation and maintenance of BI tools to ensure data quality metrics.
  • Assist in building and operationalizing self-serve data infrastructure.

Skills

GitLab CI/CD
Data Version Control
PowerBI
Google Looker Studio
Python
AWS Cloud services
Data security
Communication

Job description

In this position, you will be part of our Chapter Platform Engineering. The chapter represents all experts who deliver DevOps capabilities to our product teams. These experts are organized together to strengthen their functional skill sets, improve E.ON's DevOps tech stack, and deliver a high level of cloud automation.

The products you will be working on belong to our team Digital Solutions | Future Energy Home. This team develops and operates software to manage home energy devices such as photovoltaic systems, inverters, batteries, EV charging wall boxes, heat pumps, and meters as cloud-based solutions for our central and local units, enabling them to roll out these solutions to end customers. We integrate with numerous vendors' devices and apply centralized insights, analytics, and control mechanisms for these devices.

meaningful & challenging - Your tasks
  1. Implement CI/CD pipelines and DevOps practices focusing on ETL pipelines for the Future Energy Home Data Team.
  2. Support automation and maintenance of BI tools like PowerBI and Google Looker Studio to ensure data quality metrics through automated applications.
  3. Assist data engineers in building and operationalizing self-serve data infrastructure across multiple cloud platforms (e.g., AWS, GCP, Azure).
  4. Ensure data security of deployed data products by implementing data access and anonymization methods (e.g., data masking, pseudonymization) in compliance with the DPO's recommendations.
  5. Operationalize analytical products such as BI dashboards and Data Science ML models, and implement data quality metrics.
  6. Contribute to the "Community of Practice" to foster collaboration and knowledge exchange.
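The data-masking and pseudonymization methods mentioned in task 4 can be sketched in a few lines of Python. Everything here is illustrative (the keyed-hash approach, the field names, and the hard-coded key), not E.ON's actual implementation; in production the key would come from a secrets manager.

```python
import hashlib
import hmac

# Illustrative key only; a real deployment would load this from a secrets manager.
PSEUDONYMIZATION_KEY = b"example-key"

def pseudonymize(value: str) -> str:
    """Replace a direct identifier with a keyed hash (pseudonymization):
    re-identification is possible only for holders of the key."""
    return hmac.new(PSEUDONYMIZATION_KEY, value.encode(), hashlib.sha256).hexdigest()

def mask_email(email: str) -> str:
    """Irreversibly mask the local part of an e-mail address (data masking)."""
    local, _, domain = email.partition("@")
    return f"{local[0]}***@{domain}"

record = {"customer_id": "C-1001", "email": "jane.doe@example.com", "kwh": 12.4}
safe_record = {
    "customer_id": pseudonymize(record["customer_id"]),  # keyed hash, reversible only with the key
    "email": mask_email(record["email"]),                # masked, not recoverable
    "kwh": record["kwh"],                                # non-personal metric kept as-is
}
```

The distinction matters for the DPO compliance mentioned above: pseudonymized fields remain personal data under the GDPR (the key allows re-identification), while properly masked fields do not carry the identifier at all.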
authentic & ambitious - Your profile
  1. Several years of experience in building enterprise-grade GitLab CI/CD pipelines and Data Version Control for Python data pipeline applications (ETL).
  2. Several years of experience in building BI dashboards and monitoring agents for PowerBI and Google Looker Studio (or similar tools).
  3. Profound experience in GCP BigQuery, Databricks (e.g., PySpark), Snowflake, Python Dask & Pandas, Python-Pytest, and Python Behave.
  4. Initial experience in implementing and using Attribute-Based Access Control (ABAC) tools such as Immuta, Segmentor, or RBAC tools (e.g., Okta, AWS IAM, AWS Cognito) for democratized data access.
  5. Proven knowledge in AWS Cloud services like SageMaker, CloudFormation, CDK, Step Functions, CloudWatch, and deployment strategies like Blue-Green / Canary.
  6. Good communication skills and the ability to help others and contribute to the “Community of Practice”.
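The "data quality metrics through automated applications" in the tasks above are commonly implemented as Pytest-style checks over Pandas frames, both of which the profile names. The following is a minimal sketch under assumed column names and thresholds; the telemetry table and its plausibility range are invented for illustration.

```python
import pandas as pd

# Hypothetical extract of a home-energy telemetry table (stand-in for a real ETL output).
def load_telemetry() -> pd.DataFrame:
    return pd.DataFrame({
        "device_id": ["pv-1", "pv-2", "hp-1"],
        "kwh": [3.2, 0.0, 1.7],
    })

# Pytest discovers and runs functions named test_*; each assertion is one quality metric.
def test_no_missing_device_ids():
    df = load_telemetry()
    assert df["device_id"].notna().all()

def test_device_ids_unique():
    df = load_telemetry()
    assert df["device_id"].is_unique

def test_kwh_within_plausible_range():
    df = load_telemetry()
    # Assumed plausibility bound for a single reading; a real pipeline would tune this.
    assert df["kwh"].between(0, 100).all()
```

Run with `pytest` in CI; wiring such a suite into a GitLab CI/CD stage is exactly the kind of automation the role describes.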