
DevOps Engineer - Data (f/m/d)


Frankfurt

On-site

EUR 55,000 - 90,000

Full-time

Posted 30+ days ago


Summary

An innovative company is looking for an experienced professional to implement CI/CD and DevOps practices in the Future Energy Home domain. This position requires extensive knowledge of developing and operating software solutions for managing home energy devices. You will help automate BI tools and ensure that data products are deployed securely and efficiently. In a dynamic and flexible working environment, you can work from home or from anywhere in Germany and take part in a wide range of training opportunities. If you are passionate about technology and want to play a meaningful role in a forward-looking team, we look forward to your application.

Benefits

Flexible working
30 vacation days
Sabbatical options
Individual training
Mobility offers
Company pension scheme
Working from home

Qualifications

  • Experience building GitLab CI/CD pipelines and ETL data pipelines.
  • Proficiency with BI tools such as PowerBI and Google Looker Studio.

Tasks

  • Implementing CI/CD and DevOps practices for the Future Energy Home Data Team.
  • Supporting the automation and maintenance of BI tools for data quality metrics.

Skills

CI/CD
ETL Pipelines
PowerBI
Google Looker Studio
Python
Data Security
Data Quality Metrics
Attribute-Based Access Control (ABAC)
AWS Cloud

Tools

GitLab
GCP BigQuery
Databricks
Snowflake
Python Dask
Python-Pytest
Python Behave

Job Description

In this position, you will be part of our Chapter Platform Engineering. The chapter represents all experts who deliver DevOps capabilities to all our product teams. These experts are organized together to enhance those functional skillsets, improve E.ON’s DevOps tech stack and deliver a high level of cloud automation.

The products you will be working on belong to our team Digital Solutions | Future Energy Home. This team develops and operates software to manage home energy devices such as photovoltaic systems, inverters, batteries, EV charging wall boxes, heat pumps and meters as a cloud-based solution for our central and local units, so they can roll out those solutions to end customers. We integrate with numerous vendors' devices and apply centralized insights, analytics and control mechanisms for these devices.

Meaningful & Challenging - Your Tasks
  1. Responsible for implementing CI/CD and DevOps practices for the Future Energy Home Data Team, focusing on ETL pipelines.
  2. Support in automation and maintenance of BI Tools like PowerBI and Google Looker Studio for data quality metrics via the implementation of automated applications.
  3. Support data engineers in building and operationalizing data self-serve infrastructure across multiple cloud platforms (e.g. AWS, GCP, Azure).
  4. Ensure data security of the data products being deployed by implementing data access and anonymization methods (e.g. data masking, data pseudonymization, etc.) in compliance with the DPO's recommendations.
  5. Operationalize Analytical Products (i.e. BI dashboards and Data Science ML models) and implement data quality metrics.
  6. Contribute to “Community of Practice” to actively foster collaboration and exchange.
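To make task 4 concrete: the posting mentions data masking and pseudonymization as access-control techniques for deployed data products. A minimal sketch of both ideas in Python, using only the standard library; the helper names, the sample record, and the inline key are illustrative assumptions, and real key management (e.g. a secrets manager) is out of scope here.

```python
import hmac
import hashlib

# Hypothetical helpers sketching the two techniques named in task 4.
# Pseudonymization: replace a direct identifier with a keyed hash so
# records stay joinable without exposing the raw value.
def pseudonymize(value: str, key: bytes) -> str:
    """Return a stable, non-reversible pseudonym for `value`."""
    return hmac.new(key, value.encode("utf-8"), hashlib.sha256).hexdigest()

# Masking: hide most of a value while keeping it recognizable, e.g. for
# display in a BI dashboard.
def mask_email(email: str) -> str:
    """Mask the local part of an email address."""
    local, _, domain = email.partition("@")
    return f"{local[0]}***@{domain}" if local else email

records = [{"email": "jane.doe@example.com", "kwh": 12.4}]
key = b"demo-key"  # assumption: in production this comes from a secret store
safe = [
    {
        "user_id": pseudonymize(r["email"], key),  # joinable surrogate key
        "email": mask_email(r["email"]),           # display-safe value
        "kwh": r["kwh"],
    }
    for r in records
]
```

The keyed HMAC (rather than a plain hash) matters: without a secret key, common identifiers could be pseudonymized by an attacker and matched against the output.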
Authentic & Ambitious - Your Profile
  1. Several years of experience in building enterprise-grade GitLab CI/CD pipelines, Data Version Control for Python data pipeline applications (ETL).
  2. Several years of experience in building BI dashboards and monitoring agents for PowerBI and Google Looker Studio (or similar tools).
  3. Profound experience in GCP BigQuery, Databricks (e.g. PySpark), Snowflake, Python Dask & Pandas, Python-Pytest and Python Behave.
  4. First experience in implementing and using Attribute-Based Access Control (ABAC) tools such as Immuta, Segment or RBAC tools (e.g. Okta, AWS IAM, AWS Cognito) to democratize data access.
  5. Proven knowledge in AWS Cloud with service usage like AWS SageMaker, CloudFormation, CDK, AWS Step Functions, AWS CloudWatch and Blue-Green / Canary deployment strategies.
  6. Appropriate communication skills and ability to help others and contribute to “Community of Practice.”
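The profile pairs Python-Pytest with data quality metrics. A minimal sketch of how the two fit together, assuming a simple completeness (null-rate) metric; the column names, sample rows, and threshold are invented for illustration and are not from the posting.

```python
# Sketch of a data quality metric of the kind mentioned above,
# written so it can be collected and run by pytest.

def completeness(rows: list[dict], column: str) -> float:
    """Fraction of rows where `column` is present and not None."""
    if not rows:
        return 0.0
    filled = sum(1 for r in rows if r.get(column) is not None)
    return filled / len(rows)

def test_meter_readings_are_mostly_complete():
    # Hypothetical sample: one of four readings is missing.
    rows = [
        {"meter_id": "m1", "kwh": 10.5},
        {"meter_id": "m2", "kwh": None},
        {"meter_id": "m3", "kwh": 7.2},
        {"meter_id": "m4", "kwh": 9.9},
    ]
    # Fail the pipeline if fewer than 75% of readings are present.
    assert completeness(rows, "kwh") >= 0.75
```

Run with `pytest` in CI; a failing threshold then blocks the deployment stage, which is one common way such metrics are wired into a GitLab pipeline.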
Benefits
  • We provide full flexibility: Do your work from home or any other place in Germany - of course including all our great offices from Hamburg to Munich. You want even more? Go on workation for up to 20 days per year within Europe.
  • Recharge your battery: You have 30 holidays per year plus Christmas and New Year's Eve on top. Your battery still needs charging? You can exchange parts of your salary for more holidays or you can take a sabbatical.
  • Your development: We grow and we want you to grow with us. Learning on the job, exchanging with others or taking part in an individual training - Our learning culture enables you to bring your personal and professional development to the next level.
  • Let’s empower each other: Take the opportunity to engage in our Digital Empowerment Communities for collaboration, learning, and network building.
  • We elevate your mobility: From car and bike leasing offers to a subsidised Deutschland-Ticket - your way is our way.
  • Let’s think ahead: With our company pension scheme and a great insurance package we take care of your future.
  • This is by far not all: We are looking forward to speaking with you about further benefits during the recruiting process.