Senior Data Engineer

Retelligence

England

Hybrid

GBP 80,000 - 100,000

Full time

2 days ago

Job summary

A leading digital innovation firm in London is seeking a Senior Data Engineer to build and maintain a high-performance data system. This hybrid role requires expertise in Google Cloud Platform and Python. The engineer will lead the design of scalable data pipelines and ensure data quality and security. Ideal candidates will have a proven track record in Data Engineering and experience with distributed technologies such as Kafka. Excellent communication skills are essential for collaboration and technical leadership.

Qualifications

  • Extensive hands-on experience in Data Engineering.
  • Proven track record architecting production-grade real-time data pipelines.
  • Deep expertise in Google Cloud Platform (GCP) is essential.

Responsibilities

  • Lead the design and development of scalable real-time data pipelines.
  • Ensure seamless integration of diverse data sources across the organization.
  • Build and optimize robust data models for analytical use cases.

Skills

Data Engineering
Production-grade data pipelines
Google Cloud Platform (GCP)
Kafka
Python
SQL

Tools

Apache Airflow
Kubernetes

Job description

Job Title: Senior Data Engineer

Salary Range: £80,000–£100,000

Location / Working model: London | Hybrid

This is a critical technical position within a high-growth, forward-thinking organization that specializes in digital innovation. We require an experienced Senior Data Engineer to design, build, and maintain a robust, scalable, high-performance data platform that is fundamental to the organization's operational and analytical success.

Core Responsibilities

The successful candidate will own the technical development and strategic execution of the core data platform, ensuring high quality, performance, and security:

  • System Architecture & Development: Lead the design, development, and maintenance of scalable, high-performance, real-time data pipelines and core infrastructure components within the cloud environment.
  • Data Integration: Ensure seamless, low-latency data flow by integrating and unifying diverse data sources across the organization.
  • Data Modeling & Quality: Build and optimize robust data models specifically for complex querying and analytical use cases, and ensure data quality, integrity, and security across all systems.
  • Reliability & Monitoring: Construct highly available, fault-tolerant systems for high-volume data ingestion and processing, implementing effective monitoring, logging, and alerting mechanisms.
  • Performance Optimization: Continuously monitor, tune, and optimize pipeline scalability and performance to meet stringent throughput and latency standards.
  • Cross-Functional Alignment: Partner directly with technical and business teams to ensure the data infrastructure directly supports strategic objectives.

Required Technical Expertise

This role demands a deep, production-level background. Candidates must demonstrate proficiency in the following essential areas:

  • Production Experience: Extensive hands-on experience in Data Engineering, specifically a proven track record of successfully architecting, building, and managing production-grade real-time data pipelines across multiple large-scale initiatives.
  • Cloud Platform Specialization: Mandatory deep, practical experience leveraging Google Cloud Platform (GCP) and its associated tools for the deployment and management of real-time data ingestion and processing services. (While AWS knowledge is valuable, GCP expertise is primary.)
  • Distributed Streaming: Strong familiarity with distributed streaming technologies, such as Kafka or similar platforms.
  • Programming & Scripting: Expert proficiency in Python for development, coupled with the ability to optimize and refactor data pipelines for improved performance and scalability.
  • Database Expertise: Advanced knowledge of SQL is mandatory, alongside a strong understanding of various database technologies (NoSQL, time-series databases) and deep competence in data modeling principles.
  • Workflow Orchestration: Strong working knowledge of industry-standard data orchestration tools (e.g., Apache Airflow or Kubernetes).

The Environment

We are seeking a highly collaborative professional with strong problem-solving skills, capable of thriving in a fast-paced environment. Exceptional written and verbal communication skills in English are required for effective technical leadership and documentation.
