Lead Data Platform Engineer - AI Foundation

Deblock

City Of London

Remote

GBP 70,000 - 90,000

Full time

12 days ago

Job summary

A fintech and crypto platform based in the UK is looking for a strategic data engineer to build the foundational data platform for their AI-first transformation. The ideal candidate will have strong experience with GCP tools, expert-level SQL skills, and a solid background in building and monitoring data pipelines. The position offers a competitive salary, remote work options, and a high level of autonomy.

Qualifications

  • Strong experience with data infrastructure, especially in GCP.
  • Expert SQL knowledge is mandatory.
  • Proven ability to build and monitor data pipelines.

Responsibilities

  • Build foundational data platform for AI-first transformation.
  • Consolidate data ecosystem into a clean, governed platform.
  • Collaborate closely with CTO and ML team.

Job description

Overview

Strategic, high-impact & high-ownership role building the data foundation for our AI-first fintech & crypto platform.

Build the foundational data platform for Deblock's AI-first transformation. You'll consolidate our data ecosystem (PostgreSQL, Kafka, BigQuery, GCS) into a clean, governed, and ML-ready platform. You'll work closely with our CTO and ML team to enable AI features through robust data infrastructure.
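
As a flavour of that consolidation work, here is a minimal sketch, assuming a raw Debezium CDC feed already landed in BigQuery; the raw.cdc_transactions table and its columns are hypothetical, not Deblock's actual schema. It is a dbt-style staging query that collapses a change stream to the latest state of each row.

  -- Deduplicate a raw CDC feed (e.g. a Debezium topic landed in
  -- BigQuery) to the latest version of each record, producing a clean
  -- staging relation for downstream marts and ML features.
  SELECT * EXCEPT (row_num)
  FROM (
    SELECT
      *,
      ROW_NUMBER() OVER (
        PARTITION BY transaction_id   -- one row per business key
        ORDER BY source_ts DESC       -- keep the newest change event
      ) AS row_num
    FROM raw.cdc_transactions
  )
  WHERE row_num = 1;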

Core Requirements
  • Strong experience with GCP (BigQuery, Dataflow, Cloud Storage, DataStream, AlloyDB/CloudSQL)
  • Expert-level SQL
  • Experience with dbt or similar for data modeling and testing
  • Hands-on with streaming platforms (Kafka, Kafka Connect)
  • Understanding of CDC tools (Debezium or similar)
  • Experience building batch and real-time data pipelines
  • Experience building dimensional models (fact/dimension tables) for analytics and ML features
  • Experience implementing data governance: PII tagging, column-level security, access controls
  • Experience with data quality monitoring, including automated checks and alerting
  • Understanding of observability: data freshness monitoring, schema change detection, pipeline health dashboards
  • Experience optimising BigQuery performance (partitioning, clustering, query optimisation); see the sketch after this list
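
To make a few of these concrete, here is a minimal BigQuery sketch covering partitioning, clustering, and a freshness check. All dataset, table, and column names (analytics.fct_transactions, event_ts, user_id) are hypothetical rather than Deblock's schema; in practice a PII column like email would also be gated with a Data Catalog policy tag for column-level security.

  -- Hypothetical fact table: partitioned on event date so date-filtered
  -- queries scan fewer bytes, clustered on user_id for common lookups.
  CREATE TABLE analytics.fct_transactions (
    transaction_id STRING NOT NULL,
    user_id        STRING NOT NULL,
    amount_gbp     NUMERIC,
    email          STRING,      -- PII: gate with a policy tag in practice
    event_ts       TIMESTAMP NOT NULL
  )
  PARTITION BY DATE(event_ts)
  CLUSTER BY user_id;

  -- Freshness check for observability: returns a row (and so can drive
  -- an alert) only when the newest event is older than a 60-minute SLA.
  SELECT TIMESTAMP_DIFF(CURRENT_TIMESTAMP(), MAX(event_ts), MINUTE) AS minutes_stale
  FROM analytics.fct_transactions
  HAVING TIMESTAMP_DIFF(CURRENT_TIMESTAMP(), MAX(event_ts), MINUTE) > 60;
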
Nice to Have
  • Experience with Feature Store architecture and ML feature requirements
  • Understanding of real-time vs batch feature serving patterns
  • Prior work with financial services or regulated data environments
  • Familiarity with Vertex AI ecosystem
  • Experience with Apache Beam/Dataflow transformations
  • Background collaborating with ML/data science teams
  • Knowledge of vector databases or semantic search concepts
Benefits
  • Competitive salary + stock options
  • Private dental + health insurance
  • The best tech for your job
  • 30 days of paid holidays (excl. bank holidays)
  • Option to work 100% remotely or come to the office - your choice!
  • Ability to work abroad for 4 months a year
  • Leading position with huge impact, autonomy and ownership