Senior Data Platform Engineer

kaiko.ai

Zürich

On-site

CHF 90’000 - 120’000

Full-time

Today
Be among the first applicants

Summary

A healthcare technology company is seeking a Senior Data Platform Engineer to build and maintain a scalable data platform for AI training. This role is pivotal in managing data pipelines and services in compliance with privacy regulations. The ideal candidate will have over 4 years of experience, particularly with open-source tools. This position is based in Zurich, requiring in-office collaboration at least 50% of the time.

Benefits

  • Attractive and competitive salary
  • 25 vacation days per year
  • EUR 1000 learning and development budget
  • Annual commuting subsidy

Qualifications

  • 4+ years of experience in building production data platforms.
  • Experience with modern storage formats (e.g., Parquet, Delta).
  • Ability to manage infrastructure and deployment for data services.

Responsibilities

  • Design and maintain scalable data pipelines.
  • Collaborate with teams to support their data needs.
  • Implement data platform services for self-serve access.

Skills

  • Experience building production data platforms and data pipelines
  • Proficiency in Python or another suitable language
  • Experience with open-source software
  • Ability to thrive in a fast-paced environment

Tools

  • Apache Spark
  • Delta Lake
  • Python

Job description

Delivering high-quality cancer care is complex; specialists form a view of each patient's condition by reasoning across different data – CT scans, genomics context, treatment history and clinical notes.

Current AI systems are powerful within individual domains but fall short when it comes to reasoning across data or domain areas. kaiko.w, our AI assistant for oncology, aims to equip every clinician with a full understanding of their patients, helping them to reason across data as they assess each case.

We’re building this in close collaboration with the Netherlands Cancer Institute (NKI) and a growing network of hospitals and research centers. We’ve raised significant long-term funding and have nearly doubled our team over the past year. We’re now 80+ people representing 25 nationalities, based across our offices in Zurich and Amsterdam.

About the role

Our team is tackling the challenge of building a secure, scalable data platform that can process massive, multimodal datasets for training cutting-edge foundation models — while also handling sensitive medical data directly within hospital environments.

As a Senior Data Platform Engineer, you’ll design, deploy, and maintain a self-hosted, open-source–driven infrastructure that ingests, transforms, and serves data for both AI training pipelines and real-time hospital applications. Your work will ensure datasets are accurate, reproducible, and compliant with strict privacy regulations.

This is a rare opportunity to shape a platform from the ground up—working with state-of-the-art tooling, solving unique scaling and security problems, and owning key technical decisions that will directly impact groundbreaking AI models and healthcare workflows.

You will be based in either the Netherlands or Switzerland, with the expectation of spending at least 50% of your time at the office.

Some areas of responsibility
  • Design, implement, and maintain scalable and reproducible data pipelines to support machine learning workflows and analytics use cases.
  • Build and maintain internal data platform services that abstract away complexity and promote self-serve data access and processing.
  • Own infrastructure and deployment for data exploration, transformation, and storage - including orchestration, containerization, and monitoring.
  • Manage, deploy, and champion the use of standard open-source tooling and products in the area of data platform and data engineering.
  • Collaborate across domain teams with researchers, product teams and other stakeholders to support their data needs.

About you
  • 4+ years of experience building production data platforms and data pipelines.
  • Experience providing in-house data platform and data lakehouse services based on open-source software such as Apache Spark, Delta Lake, DuckDB, Dask, and JupyterHub (see the illustrative sketch after this list).
  • Proficiency with modern storage formats (e.g., Parquet, Delta, Iceberg) and object stores (e.g., S3, MinIO, Azure Blob).
  • Solid programming skills in Python or another language suitable for data workflows (e.g., Scala or Java).
  • Ability to thrive in a fast-paced startup environment with a high degree of ownership.
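
For illustration only, and not part of the original posting: a minimal sketch of the kind of open-source lakehouse stack named above, assuming PySpark with the delta-spark package plus DuckDB for lightweight exploration. All paths, bucket names, and data layouts are hypothetical placeholders, and object-store credentials and cluster configuration are omitted.

# Hypothetical sketch: ingest raw Parquet, persist it as a versioned Delta
# table, and explore the same kind of files with DuckDB. Assumes pyspark,
# delta-spark, and duckdb are installed.
import duckdb
from delta import configure_spark_with_delta_pip
from pyspark.sql import SparkSession

# delta-spark needs these session extensions to read and write Delta tables.
builder = (
    SparkSession.builder.appName("lakehouse-sketch")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config(
        "spark.sql.catalog.spark_catalog",
        "org.apache.spark.sql.delta.catalog.DeltaCatalog",
    )
)
spark = configure_spark_with_delta_pip(builder).getOrCreate()

# Read raw Parquet from object storage (hypothetical bucket; s3a access also
# requires the hadoop-aws jars and credentials) and write it as a Delta table.
raw = spark.read.parquet("s3a://raw-bucket/scans/")
raw.write.format("delta").mode("overwrite").save("s3a://lake/scans_delta")

# Lightweight, self-serve exploration with DuckDB (a local path keeps the
# sketch self-contained).
con = duckdb.connect()
print(con.sql("SELECT count(*) FROM read_parquet('/data/scans/*.parquet')").fetchall())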

Nice to have:

  • Experience in an AI/ML environment.
  • Understanding of data standards in the medical domain, such as DICOM, FHIR, and pathology slide images (Whole Slide Images).
  • Knowledge of monitoring, logging, alerting, and observability tools (e.g., Prometheus, Grafana, the ELK Stack, or Datadog).

We are excited to gather a broad range of perspectives in our team, as we believe it will help us build better products to support a broader set of people. If you’re excited about us but don’t fit every single qualification, we still encourage you to apply: we’ve had incredible team members join us who didn’t check every box!

At kaiko, we believe the best ideas come from collaboration, ownership and ambition. We’ve built a team of international experts where your work has direct impact. Here’s what we value:

  • Ownership: You’ll have the autonomy to set your own goals, make critical decisions, and see the direct impact of your work.
  • Collaboration: You’ll approach disagreement with curiosity, build on common ground, and create solutions together.
  • Ambition: You’ll be surrounded by people who set high standards for themselves and others, who see obstacles as opportunities, and who are relentless in their work to create better outcomes for patients.

In addition, we offer:

  • An attractive and competitive salary, a good pension plan and 25 vacation days per year.
  • Great offsites and team events to strengthen the team and celebrate successes together.
  • A EUR 1000 learning and development budget to help you grow.
  • Autonomy to do your work the way that works best for you, whether you have a kid or prefer early mornings.
  • An annual commuting subsidy.

Our interview process is designed to assess mutual fit across skills, motivation, and values. It typically includes the following steps:

  • Screening call: A short conversation to align on your motivation, career goals, and initial fit for the role.
  • Technical interview: A deep dive into your problem-solving approach through a technical challenge, case study, or role-specific scenario.
  • Onsite meeting (optional): You’ll meet team members across functions to explore collaboration dynamics, team fit, and day-to-day context.
  • Final executive conversation: A discussion with a member of the executive team focused on long-term alignment, cultural fit, and shared expectations for impact.

Seniority level
  • Mid-Senior level

Employment type
  • Full-time

Job function
  • Technology, Information and Internet; Hospitals and Health Care
