
Professional Platform Engineer – Azure Data Engineering

paiqo GmbH

Benhausen

Hybrid

EUR 60,000–80,000

Full-time

Summary

A data-driven technology company in Nordrhein-Westfalen seeks a Professional Data Platform Engineer to manage complex data pipelines and integrate Azure services. Candidates should have 2–5 years of data engineering experience, strong SQL and Python skills, and a desire to work with the latest technology in a hybrid environment. Join an open and inclusive team culture focused on continuous learning and innovation.

Benefits

Flexible working hours
State-of-the-art technology stack
Targeted further training
Innovation space
Diversity & inclusion
Real influence in tech decisions

Qualifications

  • 2–5 years of experience in data engineering, preferably with Azure Data Services.
  • Very good SQL and Python skills with experience in Delta Lake and streaming.
  • Knowledge of CI/CD and DevOps processes, cloud security, and governance.

Responsibilities

  • Plan, develop, and maintain batch and streaming pipelines.
  • Integrate Azure features such as mirroring and Delta Live Tables.
  • Automate deployments and testing using CI/CD and infrastructure as code.
  • Ensure data quality, lineage, and security using Microsoft Purview.
  • Collaborate with data scientists and product owners on data solutions.
  • Evaluate new services for production use.

Skills

Data engineering experience
SQL expertise
Python programming
Knowledge of Delta Lake
CI/CD processes knowledge
Cloud security understanding
Stream processing
Solution-oriented thinking
Strong communication skills
German language proficiency
English language proficiency

Tools

Azure Data Factory
Databricks
MS Fabric
Bicep
Terraform

Job description

We digitize decisions with data—would you like to get involved?

We shape our customers' data-driven future with scalable, secure, and automated Azure platforms. As a Professional Data Platform Engineer, you will be responsible for complex data pipelines: from zero-ETL replication of operational data (mirroring) to Delta Lake-based lakehouses. Our vision: self-service data access for all departments, supported by DataOps and governance.

Tasks & Responsibilities:
  • Planning, development, and maintenance of robust batch and streaming pipelines with Azure Data Factory, MS Fabric, and Databricks (a minimal pipeline sketch follows this list)

  • Integration of new Azure features such as mirroring (zero ETL), Delta Live Tables, and Unity Catalog for centralized governance

  • Automation of deployments and testing using CI/CD (Azure DevOps or GitHub Actions) and infrastructure as code (Bicep/Terraform)

  • Ensuring data quality, lineage, and security—using Microsoft Purview and role-based access control

  • Collaborating with data scientists, product owners, and customers to translate requirements into scalable data solutions

  • Evaluating new services such as Lakeflow or Microsoft Fabric for production use
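
As a concrete illustration of the pipeline work listed above, here is a minimal PySpark sketch of a streaming job that lands JSON events in a Delta table. It is a sketch under assumptions, not a description of paiqo's actual stack: the schema, paths, and table name are illustrative placeholders.

    from pyspark.sql import SparkSession
    from pyspark.sql.types import StructType, StructField, StringType, TimestampType

    spark = SparkSession.builder.appName("orders-stream").getOrCreate()

    # Assumed event schema for this example.
    schema = StructType([
        StructField("order_id", StringType()),
        StructField("status", StringType()),
        StructField("updated_at", TimestampType()),
    ])

    # Read newline-delimited JSON files as a stream; on Databricks, Auto Loader
    # ("cloudFiles") or an Event Hubs source would be drop-in replacements.
    events = (
        spark.readStream
        .format("json")
        .schema(schema)
        .load("/landing/orders/")  # placeholder input path
    )

    # Append to a Delta table; the checkpoint gives the stream exactly-once recovery.
    (
        events.writeStream
        .format("delta")
        .option("checkpointLocation", "/checkpoints/orders/")  # placeholder
        .outputMode("append")
        .toTable("lakehouse.orders")  # placeholder table name
    )

In practice such a job would be parameterized and deployed through the CI/CD and infrastructure-as-code tooling mentioned above rather than run by hand.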

What we offer you:
  • Flexible working: Trust-based working hours, hybrid working, and remote working possible (residence in Germany required)

  • State-of-the-art technology stack: Work with Fabric, Delta Lake, Databricks, and mirroring; ideal for tech-savvy juniors

  • Targeted further training: Working hours for training, certifications, and mentoring

  • Innovation space: Opportunity to test new tools and frameworks and develop proof-of-concepts

  • Diversity & inclusion: We welcome all applicants and promote an inclusive environment; your ideas are important to us

  • Real influence: You can help shape technology decisions and contribute your ideas directly to product roadmaps

If you want to take on responsibility, enjoy working with the latest Azure technology, and value an open, learning-oriented team culture, we look forward to receiving your application!

Your profile:
  • 2–5 years of experience in data engineering, preferably with Azure Data Services (Data Factory, Databricks, MS Fabric)

  • Very good SQL and Python skills; experience with Delta Lake, streaming (Event Hubs, Kafka; a minimal consumer sketch follows this list), and data modeling

  • Knowledge of CI/CD and DevOps processes as well as cloud security and governance

  • Solution-oriented thinking, strong communication skills, and willingness to learn new things

  • Very good German and good English skills
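
To make the streaming requirement above more tangible, here is a minimal consumer sketch using the azure-eventhub Python SDK. The connection string, hub name, and consumer group are placeholders, not details from this posting.

    from azure.eventhub import EventHubConsumerClient

    def on_event(partition_context, event):
        # Handle one event; a production consumer would checkpoint progress
        # via a checkpoint store (e.g. Azure Blob Storage) instead of printing.
        print(partition_context.partition_id, event.body_as_str())

    client = EventHubConsumerClient.from_connection_string(
        conn_str="Endpoint=sb://<namespace>.servicebus.windows.net/;...",  # placeholder
        consumer_group="$Default",
        eventhub_name="orders",  # placeholder hub name
    )

    with client:
        # starting_position="-1" reads each partition from the beginning.
        client.receive(on_event=on_event, starting_position="-1")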
