
Data Engineer

LANCH

Berlin

On-site

EUR 50,000 - 70,000

Full-time

18 days ago

Summary

LANCH, a high-growth food tech startup based in Berlin, is seeking a Data Engineer to establish its data infrastructure. In this role, you will design and implement critical data pipelines, collaborate with product and engineering teams, and contribute to innovative food brand solutions. The ideal candidate has experience in cloud environments, is proficient in Python and SQL, and enjoys translating complex business challenges into data-driven solutions. Join our dynamic team and help shape the future of food delivery.

Qualifications

  • 2+ years building data infrastructure.
  • Professional experience in ELT pipelines.
  • Fluent in Python and capable of writing performant SQL.

Responsibilities

  • Architect and launch scalable event-streaming platforms.
  • Build and maintain Reverse ETL layers.
  • Collaborate with backend engineers to enhance analytics.

Skills

Data infrastructure
Python
SQL
Airflow
Docker
GCP
Collaboration

Tools

BigQuery
Apache Kafka
Terraform

Job description

About the role

About LANCH
LANCH, the fastest-growing consumer company in DACH, is seeking a talented and motivated Data Engineer to join our dynamic team.
Founded in 2023 and headquartered in Berlin, LANCH partners with restaurants and top creators to launch delivery-first food brands such as Happy Slice pizza, Loco Chicken, and the new Korean-style Koco Chicken. Beyond virtual kitchens, we are rolling out a network of physical restaurants and retail brands (“Happy Chips”, “Loco Tortillas”) that already reach thousands of supermarkets. Backed by €26 million in Series A funding (Feb 2025), our Tech & Data team is building the platforms - LANCH OS and the Partner Portal - that power everything from menu management to supply-chain automation.

The Role
We’re looking for our first Data Engineer to lay the foundations of LANCH’s end-to-end data platform. You’ll own everything that turns operational events into trusted, analysis-ready datasets - from real-time streaming and batch pipelines to the orchestration frameworks that keep them humming. Working hand-in-hand with product, engineering, and ops, you will design and implement the data infrastructure that powers menu optimisation, delivery routing, brand performance dashboards, and much more.

Key Responsibilities

  • Architect and launch a scalable event-streaming platform (e.g., Pub/Sub, Kafka) that captures orders, logistics updates, and app interactions in real time (see the sketch after this list).
  • Build and maintain a modern Reverse ETL layer (e.g., Census, Hightouch) to push clean warehouse data back to internal applications like our Partner Portal, LANCH OS, or our CRM.
  • Evolve our Airflow and ELT environment: modular DAG design, automated testing, CI/CD, observability, and cost-efficient GCP execution.
  • Collaborate with backend engineers to instrument services for analytics & tracing; champion event naming conventions and schema governance.
  • Set engineering standards - code reviews, documentation, security, and infra as code (Terraform) - that will scale as we 10x the team and data volume.
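
As a loose illustration of the event-streaming bullet above, here is a minimal sketch of publishing an order event to Google Pub/Sub with the official google-cloud-pubsub client; the project ID, topic name, and event fields are hypothetical placeholders, not LANCH's actual schema.

    # Minimal sketch: publish an order event to Google Pub/Sub.
    # Assumes google-cloud-pubsub is installed and GCP credentials are set;
    # project ID, topic name, and event fields are hypothetical placeholders.
    import json

    from google.cloud import pubsub_v1

    publisher = pubsub_v1.PublisherClient()
    topic_path = publisher.topic_path("lanch-example-project", "order-events")

    event = {
        "event_name": "order.created",  # example of a governed event name
        "order_id": "o-12345",
        "brand": "Happy Slice",
        "total_cents": 1890,
    }

    # Payloads must be bytes; attributes enable coarse filtering downstream.
    future = publisher.publish(
        topic_path,
        data=json.dumps(event).encode("utf-8"),
        event_name=event["event_name"],
    )
    print(future.result())  # blocks until Pub/Sub returns the message ID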

About you

About You – what will make you thrive at LANCH

  • 2+ years building data infrastructure in cloud environments.
  • Professional experience in designing and developing ELT pipelines.
  • Hands-on experience with at least one streaming technology (Pub/Sub, Kafka, Kinesis, Dataflow, ...).
  • Fluent in Python for data processing; comfortable writing performant SQL (BigQuery dialect a plus).
  • Proven track record orchestrating pipelines with Airflow (or Dagster, Prefect) and deploying via Docker & GitHub Actions (see the DAG sketch after this list).
  • Product mindset: you enjoy sitting with ops teams or restaurant managers to translate fuzzy business challenges into robust data pipelines.
  • Bias for action and ownership: you prototype quickly, measure impact, and iterate - yesterday’s idea should be today’s scheduled DAG.
  • Collaborative communicator - fluent English; conversational German.
  • Eager to work mostly on-site in our Berlin Prenzlauer Berg office.
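
To make the orchestration requirement concrete, below is a minimal sketch of a modular daily ELT DAG using Airflow's TaskFlow API; the schedule, task bodies, and identifiers are illustrative assumptions, not the actual pipeline.

    # Minimal sketch of a daily ELT DAG with Airflow's TaskFlow API.
    # Schedule, task bodies, and identifiers are illustrative assumptions.
    from datetime import datetime

    from airflow.decorators import dag, task

    @dag(schedule="@daily", start_date=datetime(2025, 1, 1), catchup=False, tags=["elt"])
    def orders_elt():
        @task
        def extract() -> list[dict]:
            # In practice: pull raw order events from the warehouse landing zone.
            return [{"order_id": "o-12345", "total_cents": 1890}]

        @task
        def transform(rows: list[dict]) -> list[dict]:
            # In practice: hand off to dbt models; here we just derive a euro amount.
            return [{**row, "total_eur": row["total_cents"] / 100} for row in rows]

        @task
        def load(rows: list[dict]) -> None:
            # In practice: write the result to a BigQuery reporting table.
            print(f"loaded {len(rows)} rows")

        load(transform(extract()))

    orders_elt()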

Our Tech Stack
  • Data Warehouse: BigQuery
  • Transformation & Modelling: dbt, SQL
  • Orchestration: Airflow
  • Streaming / Messaging: Google Pub/Sub, Apache Kafka (greenfield)
  • Backend & APIs: Python, FastAPI, SQLModel, PostgreSQL (see the sketch below)
  • Infrastructure: GCP, Terraform, Docker, GitHub Actions
  • Analytics & BI: Metabase, Pandas, Notebook-based exploration
  • Reverse ETL: Census, Hightouch, ... (greenfield)
If parts of the stack are new to you, no worries - what matters most is your drive to learn fast and build data products that power thousands of meals a day.
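
For a flavour of the backend side of this stack, here is a minimal FastAPI + SQLModel sketch; the Order model, connection string, and endpoint are invented for illustration and do not reflect LANCH's actual services.

    # Minimal sketch of a FastAPI service backed by SQLModel on PostgreSQL.
    # The model, connection string, and route are hypothetical examples.
    from fastapi import FastAPI, HTTPException
    from sqlmodel import Field, Session, SQLModel, create_engine

    class Order(SQLModel, table=True):
        id: int | None = Field(default=None, primary_key=True)
        brand: str
        total_cents: int

    engine = create_engine("postgresql://user:pass@localhost/lanch_example")
    app = FastAPI()

    @app.on_event("startup")
    def init_db() -> None:
        SQLModel.metadata.create_all(engine)

    @app.get("/orders/{order_id}")
    def read_order(order_id: int) -> Order:
        with Session(engine) as session:
            order = session.get(Order, order_id)
            if order is None:
                raise HTTPException(status_code=404, detail="Order not found")
            return order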

If shaping the data foundation of a high-growth food tech startup excites you, we’d love to meet you.