DevOps Engineer - Data (f/m/d)

Operationalize Analytical Products (i.e., B2B, B2C, etc.)

Frankfurt

On-site

EUR 60,000 - 85,000

Full-time

Posted 3 days ago

Summary

A leading company in digital solutions is looking for a DevOps Engineer specializing in data to enhance their cloud automation and analytics capabilities. The role involves implementing CI/CD strategies, ensuring data quality, and operationalizing analytical products. Candidates should have several years of experience with ETL pipelines, BI tools, and cloud platforms.

Qualifications

  • Several years of experience in enterprise-grade GitLab CI/CD.
  • Experience with BI dashboards.
  • Proven knowledge in AWS Cloud services.

Tasks

  • Implement CI/CD and DevOps practices focusing on ETL pipelines.
  • Support automation and maintenance of BI tools.
  • Assist data engineers in building self-service data infrastructure.

Skills

ETL Pipelines
Automation
Data Security
Communication

Tools

PowerBI
Google Looker Studio
GCP BigQuery
Databricks
Python
AWS Cloud Services

Job Description

DevOps Engineer - Data (f/m/d), Frankfurt am Main

Client: Operationalize Analytical Products (i.e., B2B, B2C, etc.)
Location: Frankfurt am Main
Job Category: Other
EU work permit required: Yes
Job Reference: 39c4a97e08f8
Job Views: 5
Posted: 21.06.2025
Expiry Date: 05.08.2025

Job Description:

About the team

In this position, you will be part of our Chapter Platform Engineering. The chapter brings together all experts who deliver DevOps capabilities to our product teams; organizing them as one group strengthens their functional skill sets, improves E.ON’s DevOps tech stack, and sustains a high level of cloud automation.

The products you will be working on belong to our team Digital Solutions | Future Energy Home. This team develops and operates software to manage home energy devices such as photovoltaic systems, inverters, batteries, EV charging stations, heat pumps, and meters as a cloud-based solution for our central and local units, enabling rollout to end-customers. We integrate with numerous vendors’ devices and apply centralized insights, analytics, and control mechanisms.

Meaningful & challenging - Your tasks

  • Implementing CI/CD and DevOps practices with a focus on ETL pipelines.
  • Supporting the automation and maintenance of BI tools such as PowerBI and Google Looker Studio, enforcing data quality metrics through automated applications.
  • Assisting data engineers in building and operationalizing self-service data infrastructure across multiple cloud platforms (e.g., AWS, GCP, Azure).
  • Ensuring the data security of deployed products by implementing data access and anonymization methods (e.g., data masking, pseudonymization) in compliance with DPO recommendations; see the sketch after this list.
  • Operationalizing analytical products (e.g., dashboards and ML models) and implementing data quality metrics.
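
As an illustration of the anonymization task above, here is a minimal sketch of column-level pseudonymization with pandas. The column names, the example data, and the pseudonymize helper are hypothetical, and a real deployment would load the salt from a secrets manager rather than hardcoding it.

    import hashlib

    import pandas as pd

    # Hypothetical salt; in practice, load it from a secrets manager.
    SALT = "example-salt"

    def pseudonymize(df: pd.DataFrame, columns: list[str]) -> pd.DataFrame:
        """Replace direct identifiers with salted SHA-256 hashes."""
        out = df.copy()
        for col in columns:
            out[col] = out[col].astype(str).map(
                lambda v: hashlib.sha256((SALT + v).encode()).hexdigest()
            )
        return out

    # Usage with made-up example data:
    df = pd.DataFrame({"customer_email": ["a@example.com"], "kwh": [42.0]})
    print(pseudonymize(df, ["customer_email"]))

Salted hashing keeps the mapping deterministic, so joins across tables still line up while the original identifiers stay unreadable downstream.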

Authentic & ambitious - Your profile

  • Several years of experience building enterprise-grade GitLab CI/CD pipelines and Data Version Control for Python data pipeline applications (ETL).
  • Experience in building BI dashboards and monitoring agents for PowerBI and Google Looker Studio (or similar tools).
  • In-depth experience with GCP BigQuery, Databricks (e.g., PySpark), Snowflake, Python Dask & Pandas, Python-Pytest, and Python Behave (see the test sketch after this list).
  • Experience implementing and using Attribute-Based Access Control (ABAC) tools like Immuta, Segmentor, or RBAC tools (e.g., Okta, AWS IAM, AWS Cognito) for data access management.
  • Proven knowledge of AWS Cloud services such as SageMaker, CloudFormation, CDK, Step Functions, CloudWatch, and deployment strategies like Blue-Green/Canary.
  • Good communication skills and ability to contribute to a community of practice.
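
As an illustration of the Pytest requirement, here is a minimal sketch of automated data quality checks against a pipeline output. The table, columns, and fixture are hypothetical; a real suite would read from BigQuery, Databricks, or Snowflake instead of an in-memory frame.

    # test_data_quality.py -- run with `pytest`
    import pandas as pd
    import pytest

    @pytest.fixture
    def meter_readings() -> pd.DataFrame:
        # Hypothetical stand-in for a pipeline output table.
        return pd.DataFrame(
            {"device_id": ["pv-001", "pv-002"], "kwh": [3.2, 0.0]}
        )

    def test_no_missing_device_ids(meter_readings):
        assert meter_readings["device_id"].notna().all()

    def test_kwh_non_negative(meter_readings):
        # Energy readings should never be negative.
        assert (meter_readings["kwh"] >= 0).all()

Checks like these can run as a stage in the GitLab CI/CD pipeline so that data quality gates every deployment.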