Senior Data Engineer Snowflake & Data Sharing (d/f/m, Berlin)

Monda Labs

Berlin

Hybrid

EUR 75,000

Full-time

Today

Summary

A data sharing software company in Berlin is searching for a Senior Data Engineer to join their engineering team. The role involves developing cross-cloud data sharing features, ensuring data flow efficiency, and collaborating with teams to align technical solutions. Candidates should have comprehensive experience in Snowflake, Python, and data pipeline engineering, with a base salary starting at 75K gross per annum. This position supports a hybrid work environment.

Qualifications

  • 5 years of hands-on experience as a Data Engineer or similar role.
  • Expert knowledge of Snowflake and Python.
  • Proficiency in data orchestration frameworks such as Prefect.

Tasks

  • Enable customers to connect diverse data sources and publish data products.
  • Ensure solid data flows by operating Python pipelines on AWS ECS.
  • Design and maintain our Snowflake-based cross-cloud data platform.

Skills

Snowflake
Python
Data Pipeline Engineering
AWS
Docker
Data Orchestration

Education

Degree in Computer Science or related field

Tools

Terraform
Prefect
Apache Airflow

Job Description

We build data product sharing software which fuels AI.

Monda believes that any company should be able to share and access the data they need to fuel AI. Therefore we create a borderless data sharing ecosystem to fuel the AI revolution and accelerate human progress. We encourage and empower any company in the world to share and monetize their data safely.

Our Engineering Team:

We are a passionate, multicultural engineering team dedicated to turning complex data challenges into seamless software, always deciding, acting, and delivering with the customer experience at the core.

Our Data Tech Stack:

Snowflake, Prefect, Python, Django, AWS, Terraform, Cloudflare, Docker, GitHub, Heroku

Our Tech Challenges:
  • Simplify cross-cloud data product creation: Enable easy onboarding of data sources from multiple cloud environments and ensure reliable data delivery for true cross-cloud sharing, supporting seamless data marketplace integrations across AWS, GCP, Azure, Snowflake, and Databricks.
  • Fuel the AI revolution: Streamline data customization and multi-asset data product management, empowering integrations with leading data marketplaces such as Datarade, Snowflake Marketplace, Databricks, Google Cloud Analytics Hub, and SAP Datasphere.
  • Drive innovation in data platforms: Tackle the challenges of scalability, reliability, and performance in a rapidly evolving multi-cloud ecosystem while enabling business-ready, high-quality data products.
Tasks

We're looking for a Senior Data Engineer (d/f/m) to join our software engineering team in Berlin. As an individual contributor (IC), you'll work closely with our Head of Engineering. You'll not only develop new cross-cloud data sharing features from scratch but also improve the stability of the platform and all data pipelines.

The start date is 1 January 2026; the work location is Berlin (hybrid); the base salary starts at EUR 75K gross per annum (based on experience).

What you'll do:
  • Enable customers to connect diverse data sources and publish marketplace-ready data products, fueling seamless data exchange and monetization
  • Ensure solid, observable, and efficient data flows by operating Python pipelines orchestrated with Prefect and running on AWS ECS
  • Connect and unify data across multiple cloud environments, enabling secure and high-performance data exchange between diverse customer systems and platforms
  • Collaborate across teams to align technical solutions with customer and business needs, driving engineering excellence and platform reliability
  • Design, develop, and maintain our Snowflake-based cross-cloud data platform with a scalable and future-proof architecture
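The Prefect-orchestrated pipelines above follow the common task/flow pattern: small retryable steps composed into a flow. As a rough, dependency-free sketch of that idea in plain Python (a stand-in for Prefect's `@task(retries=...)` decorator; the step names and sample rows are illustrative only, not Monda's actual code):

```python
# Minimal sketch of the task/retry pattern that orchestrators like
# Prefect provide out of the box. All names here are hypothetical.
import time
from functools import wraps

def task(retries=2, delay=0.0):
    """Retry a failing pipeline step, loosely mimicking
    Prefect's @task(retries=...) behavior in plain Python."""
    def decorator(fn):
        @wraps(fn)
        def wrapper(*args, **kwargs):
            for attempt in range(retries + 1):
                try:
                    return fn(*args, **kwargs)
                except Exception:
                    if attempt == retries:
                        raise
                    time.sleep(delay)
        return wrapper
    return decorator

@task(retries=2)
def extract():
    # In a real pipeline this would pull from a customer data source.
    return [{"id": 1, "value": 10}, {"id": 2, "value": 20}]

@task(retries=2)
def transform(rows):
    # Simple per-row transformation standing in for business logic.
    return [{**r, "value": r["value"] * 2} for r in rows]

def flow():
    """The 'flow': run the steps in order and return the result."""
    return transform(extract())

if __name__ == "__main__":
    print(flow())
```

In a real deployment the flow would be packaged in a Docker image and run as an AWS ECS task, with Prefect handling scheduling, retries, and observability.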
Requirements
  • 5 years of hands-on experience as a Data Engineer or in a similar role, building and maintaining large-scale data systems in a Snowflake environment
  • Expert knowledge of Snowflake and Python, with experience designing efficient, scalable, and secure data pipelines
  • Proficiency in data orchestration frameworks such as Airflow, Dagster, or dbt; experience with Prefect is a strong plus
  • Solid understanding of containerization and cloud infrastructure, particularly Docker and AWS ECS
  • Proven experience in data platform architecture, data delivery, and data lifecycle best practices
  • Strong sense of ownership and accountability, with the ability to prioritize, assess criticality, and deliver results under minimal supervision
  • Experience working in agile, cross-functional teams, embracing iterative development and continuous improvement
Bonus qualifications:
  • Degree in Computer Science, Information Systems, Application Programming, or a related technical field
  • Hands-on experience with Infrastructure as Code (IaC) tools such as Terraform
  • Background in international B2B software applications, ideally within the e-commerce industry
  • In-depth knowledge of multiple cloud service providers (e.g. AWS, GCP, Azure) and experience working in cross-cloud environments
  • Genuine passion for Data Engineering, with additional experience in web application development or adjacent software domains

We'd love to hear from you! Apply now and expect a fast, transparent hiring process: a quick intro call, a focused code challenge, and conversations with the team and founders.


Employment Type: Employee

Experience: years

Vacancy: 1

Yearly Salary: 75000 - 75000
