Lead Data Architect / Engineering Consultant

ZipRecruiter

City Of London

On-site

GBP 70,000 - 100,000

Full time

Today

Job summary

A leading recruitment platform is seeking an experienced Lead Data Architect / Engineering Consultant to deliver a robust cloud data platform. The role involves end-to-end project ownership, designing scalable architectures, and mentoring internal teams. Ideal candidates will have extensive experience with Azure technologies and a strong track record in delivering enterprise-scale solutions.

Qualifications

  • Extensive experience delivering enterprise-scale data platforms in cloud environments.
  • Deep expertise in lakehouse architectures and telemetry/IoT pipelines.
  • Proven ability to lead complex delivery across technical and non-technical teams.

Responsibilities

  • Lead the end-to-end delivery of the data & analytics platform.
  • Define the architectural vision and implementation roadmap.
  • Mentor and upskill internal data engineers and analysts.

Skills

Azure Event Hubs
IoT Hub
Data Factory
SQL
Python
Delta Lake
Spark
Real-time processing
Telemetry pipelines
Batch processing

Job Description

Our client is seeking an experienced Lead Data Architect / Engineering Consultant to deliver their cloud data platform. This platform will serve as the backbone for ingesting, analysing, and visualising operational data, enabling real-time monitoring, alerting, and long-term trend analysis. You will work across disciplines (engineering, analytics, asset operations, trading) to design a scalable, secure, and high-performance platform that supports current needs and enables future growth.

The key responsibilities are as follows:

Project Ownership

  • Lead the end-to-end delivery of the data & analytics platform, from discovery through production handover.
  • Define the architectural vision, design decisions, and implementation roadmap; coordinate closely with internal stakeholders.
  • Facilitate technical discovery sessions with the data, trading, and operations teams to capture cross-functional requirements.
  • Ensure that documentation, codebases, and data models are structured to support maintainability and reuse beyond the project.

Architecture & Platform Design

  • Design a modular, multi-tenant architecture to accommodate diverse asset telemetry and operational datasets.
  • Select appropriate Azure technologies and apply scalable data patterns for ingestion, processing, storage, and access.
  • Incorporate Medallion architecture principles with time-series and event data optimised for both real-time and historical analytics.
  • Ensure interoperability with existing platforms (BI tools, trading models, performance dashboards).
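
For readers unfamiliar with the Medallion pattern referenced above, the bronze/silver/gold layering can be sketched in plain Python (an illustrative toy, not the client's implementation; a real platform would use Spark and Delta Lake, and the field names, asset IDs, and sample values here are invented):

```python
# Toy Medallion-architecture pass over telemetry records. Illustrative only:
# field names and sample data are invented for this sketch.

raw_events = [  # "bronze" layer: raw ingested telemetry, kept as received
    {"asset": "turbine-1", "temp_c": "41.5", "ts": "2024-01-01T00:00:00Z"},
    {"asset": "turbine-1", "temp_c": "bad",  "ts": "2024-01-01T00:01:00Z"},
    {"asset": "turbine-2", "temp_c": "38.0", "ts": "2024-01-01T00:00:30Z"},
]

def to_silver(events):
    """Silver layer: parse types, validate, and drop malformed records."""
    silver = []
    for e in events:
        try:
            temp = float(e["temp_c"])
        except ValueError:
            continue  # a real pipeline would quarantine, not silently drop
        silver.append({"asset": e["asset"], "temp_c": temp, "ts": e["ts"]})
    return silver

def to_gold(silver):
    """Gold layer: business-level aggregate, here mean temperature per asset."""
    totals = {}
    for rec in silver:
        s, n = totals.get(rec["asset"], (0.0, 0))
        totals[rec["asset"]] = (s + rec["temp_c"], n + 1)
    return {asset: s / n for asset, (s, n) in totals.items()}

print(to_gold(to_silver(raw_events)))  # {'turbine-1': 41.5, 'turbine-2': 38.0}
```

Each layer adds refinement: bronze preserves raw fidelity for replay, silver enforces quality, and gold serves analytics-ready aggregates, which is what makes the pattern suit both real-time and historical queries.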

Engineering & Implementation

  • Build robust batch and streaming pipelines using Azure Data Factory, Databricks, and Spark Structured Streaming.
  • Implement secure, scalable storage using Delta Lake formats, Azure Data Lake Gen2, and time-series solutions (e.g. Azure Data Explorer).
  • Deliver data validation, schema evolution, metadata management, and lineage tracking to production quality.
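
As a hedged illustration of the schema-evolution requirement above, the plain-Python sketch below permits only additive changes; it mirrors the spirit of Delta Lake's mergeSchema option but is not its actual API, and all field names are invented:

```python
# Illustrative additive schema evolution: new fields are merged in, but
# changing an existing field's type is rejected. This is a conceptual
# sketch, NOT Delta Lake's real API.

def evolve_schema(current, incoming):
    """Return a merged schema, allowing only additive (new-field) changes."""
    merged = dict(current)
    for field, dtype in incoming.items():
        if field in merged and merged[field] != dtype:
            raise ValueError(f"type conflict on {field!r}: "
                             f"{merged[field]} vs {dtype}")
        merged[field] = dtype  # new field: safe to add
    return merged

v1 = {"asset": "string", "temp_c": "double", "ts": "timestamp"}
v2 = {"asset": "string", "temp_c": "double", "ts": "timestamp",
      "vibration_hz": "double"}  # a newly onboarded telemetry field

merged = evolve_schema(v1, v2)  # accepts the added field
```

Restricting evolution to additive changes keeps downstream consumers (dashboards, trading models) from breaking when new telemetry fields are onboarded.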

Mentoring & Capability Building

  • Actively mentor and upskill internal data engineers and analysts throughout the project.
  • Establish coding standards, design patterns, and operational best practices tailored to the team's capabilities.
  • Support the Analytics team in developing a roadmap for sustained platform ownership and ongoing enhancements.
  • Where appropriate, help integrate platform components with broader trading and forecasting infrastructure.

Governance, Security & Handover

  • Define and implement access control, encryption, and cost monitoring policies aligned with Azure best practices.
  • Produce complete documentation for infrastructure, pipelines, operational procedures, and support processes.
  • Provide transition and onboarding support to ensure the internal team can own and evolve the platform post-delivery.

Required Skills & Experience

  • Extensive experience delivering enterprise-scale data platforms in cloud environments (preferably Azure).
  • Deep expertise in lakehouse architectures, real-time/batch processing, and telemetry/IoT pipelines.
  • Proven ability to lead complex delivery across technical and non-technical teams.
  • Strong hands-on experience with:
      • Azure Event Hubs, IoT Hub, Data Factory, Data Lake Gen2, Synapse, Databricks
      • Spark, Structured Streaming, Delta Lake, Parquet
      • SQL, Python
      • Time-series DBs (Azure Data Explorer, InfluxDB, TimescaleDB)
  • Familiarity with Medallion architecture, schema evolution, cost optimisation, and system observability.
  • Track record of mentoring internal teams and embedding best practices in technical delivery.
  • Experience in energy, grid operations, or industrial IoT highly desirable.

FTC: 6 to 12 months. There is flexibility to extend the scope of this engagement to review and upgrade the wider data infrastructure supporting Analytics and Trading functions.
