Senior Data Engineer

Berlin
EUR 55,000 - 90,000
Posted 3 days ago
Job Description

ABOUT THE TEAM:

We're a passionate team consisting of a Head of Data, an Analytics Engineer, and a Senior Data Analyst, dedicated to all things data. You'll be at the center of our expanding Data team. We are enthusiastic about exploring new tools and technologies, and we thrive on brainstorming to solve challenges and apply best practices. Our team is set to grow soon, welcoming Data Engineers, Analytics Engineers, Analysts, and Data Scientists, all playing pivotal roles in Talon.One's development.

ABOUT THE ROLE:

As a Senior Data Engineer / Data Ops, you will be responsible for engineering tasks related to infrastructure, data ingestion, and DWH pipelines. You will own our data ingestion workflows (currently based on Kafka streams) and ensure that data from our high-volume PostgreSQL instances is ingested live into BigQuery. Additionally, you will manage our overall DWH infrastructure, recommend improvements, and ensure adherence to best practices. You will also contribute to Data Engineering pipelines, helping to model various data sources to support decision-making across departments.
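The ingestion flow described above (Postgres change events streamed through Kafka into BigQuery) might look roughly like the sketch below. This is an illustrative example only, not Talon.One's actual implementation: the event shape, field names, and the `to_bigquery_row` helper are all hypothetical, and a real pipeline would read from a Kafka consumer and write via the BigQuery client rather than working with plain dicts.

```python
import json
from datetime import datetime, timezone

def to_bigquery_row(cdc_event: dict) -> dict:
    """Flatten a hypothetical Postgres CDC (change-data-capture) event,
    as it might arrive on a Kafka topic, into a row shaped for a
    BigQuery streaming insert. All field names are illustrative."""
    payload = cdc_event["payload"]
    return {
        "source_table": cdc_event["table"],
        "operation": cdc_event["op"],  # e.g. "insert" / "update" / "delete"
        **payload,                     # the captured row data itself
        "ingested_at": datetime.now(timezone.utc).isoformat(),
    }

# Example event as it might appear on a Kafka topic (hypothetical shape):
event = {
    "table": "public.customers",
    "op": "insert",
    "payload": {"id": 42, "email": "a@example.com"},
}
row = to_bigquery_row(event)
print(json.dumps({k: row[k] for k in ("source_table", "operation", "id")}))
```

Appending an `ingested_at` timestamp at write time is a common convention for auditing streaming pipelines, since it lets downstream models measure end-to-end ingestion lag.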

YOUR TASKS AND RESPONSIBILITIES:

  • Own data ingestion streams for all client databases, from PostgreSQL to BigQuery.
  • Manage the overall data infrastructure: setup, maintenance, and improvement of our stack with new technologies for efficiency and best practices, e.g., advanced DWH pipeline scheduling.
  • Develop and optimize DWH pipelines, applying best practices to enhance efficiency.
  • Collaborate within a data team comprising various profiles, participate in infrastructure decisions, and work closely with stakeholders from R&D and DevOps teams.

YOUR QUALIFICATIONS:

  • 3-5 years of experience in Data Engineering / Data Ops, preferably with large data warehouse projects.
  • Familiarity with the modern data stack, capable of recommending, setting up, and maintaining components from data ingestion to automation.
  • Experience with tools like Kafka, Airflow, dbt Core, BigQuery, Lambda functions, etc.
  • Knowledge of infrastructure/DevOps technologies such as Kubernetes, Docker, PostgreSQL, and GCP.
  • Commitment to data engineering best practices: data consistency checks, testing data reliability, monitoring pipelines, avoiding duplicate business logic.
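To make the last bullet concrete, a data consistency check often reduces to reconciling row counts between source and destination after each pipeline run. The sketch below is a generic, hypothetical example of such a check, not part of the posting's actual stack; in practice the two counts would come from queries against Postgres and BigQuery rather than plain integers, and a failure would page the on-call engineer via pipeline monitoring.

```python
def check_row_counts(source_count: int, dest_count: int,
                     tolerance: float = 0.0) -> bool:
    """Return True when the destination row count is within `tolerance`
    (expressed as a fraction of the source count) of the source count.
    The default tolerance of 0.0 demands an exact match."""
    if source_count == 0:
        return dest_count == 0
    drift = abs(source_count - dest_count) / source_count
    return drift <= tolerance

# Exact match required by default; a small tolerance can absorb
# in-flight rows in a live streaming pipeline.
print(check_row_counts(1000, 1000))                 # exact match passes
print(check_row_counts(1000, 990))                  # 1% drift fails at 0.0
print(check_row_counts(1000, 990, tolerance=0.02))  # passes at 2% tolerance
```

Whether to allow any tolerance at all is a design choice: batch reconciliation usually demands exact counts, while streaming ingestion typically tolerates a small, bounded lag.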

ABOUT TECHNOLOGY:

  • Google Cloud Platform (GKE, Cloud Functions, Pub/Sub, etc.)
  • PostgreSQL
  • ArgoCD and Helm charts for GitOps and delivery
  • Hashicorp ecosystem (Terraform, Vault, etc.)
  • Kafka and BigQuery for DataOps
  • Go for APIs and Custom Kubernetes Operators
  • React.js, TypeScript, and CSS modules for web applications

WHAT'S IN IT FOR YOU:

  • 60+ team members including engineers, product managers, and designers in Berlin
  • Leadership with 8+ years of experience in building our promotions engine
  • Learning budget and access to LinkedIn Learning
  • 30 vacation days
  • Remote work abroad up to 90 days
  • In-house German language courses
  • Discounted Urban Sports Club membership and BVG ticket
  • Work-life balance; your dog is more than welcome!
  • Mental health support through Nilo.health