
Senior Data Engineer (Kafka) (Remote - Europe)

Jobgether

United Kingdom

Remote

GBP 52,000 - 79,000

Full time

Today

Job summary

A technology firm is seeking a Senior Data Engineer specializing in Kafka to design and maintain real-time data streaming platforms. This 100% remote position offers the chance to work in an innovative environment and requires 6+ years of data engineering experience along with strong skills in Kafka and Python. Successful candidates will enjoy career progression, a competitive compensation package, and the opportunity to work with international teams.

Benefits

100% remote work setup
Career progression opportunities
International team collaboration
Competitive compensation package
Regular team events

Qualifications

  • 6+ years of experience in data engineering, especially in event-driven systems.
  • At least 3 years of hands-on experience with Kafka.
  • Strong knowledge of event modeling and data formats.

Responsibilities

  • Design and maintain Kafka-based pipelines for event flows.
  • Implement ingestion solutions with Kafka Connect.
  • Enforce data governance with schema validation.

Skills

Experience in data engineering
Expertise in Kafka
Proficiency in Python
Event modeling skills
Communication skills in English
Familiarity with cloud platforms
Experience with observability tools

Tools

Confluent Cloud
Kafka Connect
Docker
Kubernetes

Job description

Overview

This position is posted by Jobgether on behalf of a partner company. We are currently looking for a Senior Data Engineer (Kafka) in Europe.

This role offers the opportunity to own and shape a cutting-edge real-time data streaming platform, ensuring reliable and scalable event-driven solutions across complex systems. As a Senior Data Engineer, you will design and operate Kafka pipelines, enforce governance and data quality standards, and collaborate with cross-functional teams to deliver innovative solutions. You'll work in a fast-paced, international, and fully remote environment, with opportunities to contribute to transformative digital experiences. If you are passionate about data streaming, scalable architectures, and continuous optimization, this position will give you both impact and growth.

Accountabilities

  • Designing and maintaining Kafka-based pipelines (Confluent Cloud or self-managed) to support scalable, reliable event flows.
  • Architecting topics, partitions, and retention policies while implementing ingestion solutions with Kafka Connect/Debezium.
  • Enforcing data quality and governance with schema validation (Avro/Protobuf), compliance handling (GDPR/CCPA), and monitoring tools (see the sketch after this list).
  • Ensuring high availability, disaster recovery, and secure operations through CI/CD, infrastructure as code, and cost optimization.
  • Collaborating with data, analytics, product, and marketing teams to define event-driven interfaces and service-level agreements.
  • Continuously improving latency, throughput, and reliability while evaluating new streaming tools and emerging best practices.
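
As a rough illustration of the schema-validation work above, here is a minimal Python producer sketch using the confluent-kafka client with Confluent's Schema Registry. The topic name, schema, and broker/registry addresses are assumptions for illustration, not details of this role's actual stack.

    # Minimal sketch: producing schema-validated events. All names and
    # addresses here are illustrative assumptions.
    from confluent_kafka import SerializingProducer
    from confluent_kafka.schema_registry import SchemaRegistryClient
    from confluent_kafka.schema_registry.avro import AvroSerializer

    # Hypothetical Avro schema; a real deployment would evolve schemas
    # through the registry's compatibility checks.
    ORDER_SCHEMA = """
    {
      "type": "record",
      "name": "OrderPlaced",
      "fields": [
        {"name": "order_id", "type": "string"},
        {"name": "amount", "type": "double"}
      ]
    }
    """

    registry = SchemaRegistryClient({"url": "http://localhost:8081"})
    producer = SerializingProducer({
        "bootstrap.servers": "localhost:9092",
        "value.serializer": AvroSerializer(registry, ORDER_SCHEMA),
    })

    # Events that do not match the schema fail at serialization time,
    # which is the practical meaning of enforcing schema validation.
    producer.produce(topic="orders", key="order-123",
                     value={"order_id": "order-123", "amount": 42.5})
    producer.flush()

Validated this way, governance becomes a property of the pipeline itself: events that do not match the registered schema are rejected at produce time rather than cleaned up downstream.
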
Requirements

To be successful in this role, you should bring:

  • 6+ years of experience in data engineering, with strong expertise in event-driven systems.
  • At least 3 years of hands-on production experience with Kafka (Confluent Platform/Cloud), including Kafka Connect, Schema Registry, and Kafka Streams/ksqlDB (or Flink/Spark alternatives).
  • Proficiency in Python for building services, tooling, and test frameworks.
  • Strong knowledge of event modeling, data formats (Avro, Protobuf, JSON), idempotency, and DLQ handling (see the sketch after this list).
  • Excellent communication skills in English (C1 level) and experience working in consulting, agency, or client-facing environments.
  • Familiarity with cloud platforms (AWS, Azure, or GCP), container orchestration (Docker/Kubernetes), CI/CD pipelines, and Git.
  • Experience with observability tools (Prometheus, Grafana, Datadog) and incident response processes is highly valued.
  • Additional experience with Kotlin for Kafka Streams, Jupyter for analysis, IaC (Terraform/CloudFormation), CDPs (mParticle), and tag management tools (Tealium) is a strong plus.
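
To make the idempotency and DLQ expectations concrete, here is a minimal consumer sketch with the same Python client. The topic names, group id, and process() handler are hypothetical, not taken from this posting.

    # Minimal sketch: at-least-once consumption with dead-letter-queue (DLQ)
    # routing. All names here are assumptions for illustration.
    import json
    from confluent_kafka import Consumer, Producer

    def process(event):
        # Hypothetical idempotent handler; replace with real business logic.
        if "order_id" not in event:
            raise ValueError("malformed event")

    consumer = Consumer({
        "bootstrap.servers": "localhost:9092",
        "group.id": "orders-processor",
        "auto.offset.reset": "earliest",
        "enable.auto.commit": False,
    })
    dlq = Producer({"bootstrap.servers": "localhost:9092"})
    consumer.subscribe(["orders"])

    while True:
        msg = consumer.poll(1.0)
        if msg is None or msg.error():
            continue  # real code would log broker errors here
        try:
            process(json.loads(msg.value()))
        except Exception:
            # Route the poison message to a DLQ instead of blocking the
            # partition, then move on.
            dlq.produce("orders.dlq", key=msg.key(), value=msg.value())
            dlq.flush()
        # Commit only after handling (or dead-lettering) the message,
        # giving at-least-once semantics.
        consumer.commit(message=msg)

Because redelivery can happen after a crash between processing and commit, the handler's idempotency is what keeps at-least-once delivery safe.
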
Benefits

This opportunity offers:

  • 100% remote work setup within Europe, with flexibility across time zones.
  • A fast-moving, innovative environment where learning and growth are constant.
  • Career progression paths with opportunities for advancement.
  • International exposure, working with teams across Europe, the US, and beyond.
  • A supportive and collaborative work culture that values initiative and creativity.
  • Competitive compensation package with room for growth.
  • Regular virtual and in-person team events to foster strong connections.