Senior Data Engineer (contract)

Methods

Ledbury

On-site

GBP 60,000 - 80,000

Full time

Job summary

A leading IT Services Consultancy in the UK is seeking a Senior Data Engineer to develop and manage data pipelines, implement hybrid cloud solutions, and ensure data security. The ideal candidate has extensive experience with Python and ETL/ELT workflows, and a passion for data. You will drive the integration of on-premises and cloud data solutions to empower business leaders with actionable insights.

Benefits

Hybrid work model
Opportunity for professional growth
Collaborative team environment

Tools

Azure Data Factory
PostgreSQL
Docker
Kubernetes
Kafka
NATS
Keycloak
Elasticsearch

Job description

Overview

Senior Data Engineer (contract)

Location: On-site

Type: Full time

Company: Methods Business and Digital Technology Limited

Methods is a £100M+ IT Services Consultancy working with central government departments and agencies to transform the public sector in the UK. We are UK-based, established for over 30 years, and deliver end-to-end business and technical solutions that are people-centred, safe, and designed for the future. Our human-centric approach focuses on people, technology, and data to create value and sustainable outcomes for clients, staff, communities, and the planet. We support clients in delivering successful projects through collaborative problem solving, and we value learning from mistakes. Although our work is predominantly in the public sector, Methods is building a significant private-sector client portfolio and was acquired by the Alten Group in early 2022.

Requirements

On-site, Full time.

This role will require you to have ACTIVE Security Clearance, with a willingness to progress to DV (Developed Vetting).

Key Responsibilities
  • Develop and Manage Data Pipelines: Design, construct, and maintain efficient data pipelines using Python/Go/Azure Data Factory to support both streaming and batch processing across structured, semi-structured, and unstructured data in on-premises and Azure environments (an illustrative Python sketch follows this list).
  • Hybrid Cloud and Data Storage Solutions: Implement and manage data storage solutions leveraging both on-premises infrastructure and Azure, ensuring seamless data integration and accessibility across platforms.
  • Containerisation and Orchestration: Use Docker for containerisation and Kubernetes for orchestration to enable scalable deployment across cloud-based and on-premises environments.
  • Workflow Automation: Automate data flows and manage complex workflows using tools such as Azure Data Factory within hybrid environments.
  • Event Streaming Experience: Utilise event-driven technologies like Kafka and NATS to handle real-time data streams.
  • Security and Compliance: Manage security setups and access controls, incorporating tools like Keycloak to protect data integrity and comply with legal standards across all data platforms.
  • Database Development: Design and develop PostgreSQL databases with high performance and availability across deployment scenarios.
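
As a purely illustrative example, the sketch below shows one way a batch pipeline step of this kind could look in Python: extracting rows from a hypothetical PostgreSQL `events` table, transforming them, and bulk-indexing them into Elasticsearch. The connection details, table, and index names are assumptions for illustration only, not part of the role, and the same step could equally be written in Go or orchestrated from Azure Data Factory.

```python
import psycopg2                                    # PostgreSQL driver (assumed available)
from elasticsearch import Elasticsearch, helpers   # Elasticsearch client (assumed available)

# Hypothetical connection details -- replace with real hosts and credentials.
PG_DSN = "host=localhost dbname=analytics user=etl password=secret"
ES_HOST = "http://localhost:9200"

def extract(conn):
    """Yield unprocessed rows from a hypothetical 'events' table."""
    with conn.cursor() as cur:
        cur.execute("SELECT id, payload, created_at FROM events WHERE processed = false")
        yield from cur

def transform(row):
    """Shape a raw row into an Elasticsearch bulk action."""
    event_id, payload, created_at = row
    return {
        "_index": "events",                         # hypothetical index name
        "_id": event_id,
        "_source": {"payload": payload, "created_at": created_at.isoformat()},
    }

def load(es, actions):
    """Bulk-index the transformed documents."""
    helpers.bulk(es, actions)

def main():
    conn = psycopg2.connect(PG_DSN)
    es = Elasticsearch(ES_HOST)
    try:
        load(es, (transform(r) for r in extract(conn)))
    finally:
        conn.close()

if __name__ == "__main__":
    main()
```

In practice a step like this would typically run inside a Docker container scheduled by Kubernetes, or be triggered from an Azure Data Factory pipeline, which is where the containerisation and workflow-automation responsibilities above come in.
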
Essential Skills and Experience
  • Strong Python Skills: Expertise in Python for scripting and automating data processes.
  • Experience with ETL/ELT: Proven experience in developing and optimising ETL/ELT workflows in hybrid on-premises and Azure environments.
  • Hybrid Cloud Data Architecture: Knowledge of integrating on-premises infrastructure with Azure cloud services.
  • Containerisation and Orchestration Expertise: Experience with Docker, GitHub and Kubernetes across on-premises and cloud platforms.
  • Workflow Automation Tools: Practical experience with Azure Data Factory in appropriate environments.
  • Event Streaming: Experience deploying event streaming platforms like Kafka and NATS (see the consumer sketch after this list).
  • Data Security Knowledge: Experience implementing security practices and tools, including Keycloak, across multiple platforms.
  • Search and Database Development: Strong background in Elasticsearch and PostgreSQL across on-premises and cloud infrastructures.
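
For the event-streaming skill specifically, the minimal sketch below shows what consuming a real-time stream might look like in Python using the confluent-kafka client. The broker address, consumer group, and topic name are placeholders for illustration; a NATS-based equivalent would use a different client but follow the same consume-validate-forward pattern.

```python
from confluent_kafka import Consumer   # Kafka client (assumed available)

# Hypothetical broker, group, and topic -- adjust for the real cluster.
conf = {
    "bootstrap.servers": "localhost:9092",
    "group.id": "data-engineering-demo",
    "auto.offset.reset": "earliest",
}

consumer = Consumer(conf)
consumer.subscribe(["raw-events"])

try:
    while True:
        msg = consumer.poll(1.0)        # wait up to one second for a message
        if msg is None:
            continue
        if msg.error():
            print(f"Consumer error: {msg.error()}")
            continue
        # A real pipeline would validate and enrich the message here, then
        # write it on to PostgreSQL, Elasticsearch, or a downstream topic.
        print(f"{msg.topic()}[{msg.partition()}] @ {msg.offset()}: {msg.value().decode('utf-8')}")
finally:
    consumer.close()
```
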
Your Impact

In this role, you will empower business leaders to make informed decisions by delivering timely, accurate, and actionable data insights from a robust, hybrid infrastructure. You will drive seamless integration of on-premises and cloud-based data solutions, enhancing the flexibility and scalability of data operations. You will champion modern data architectures and tooling, mentor team members, and advance engineering practices to cultivate a data-driven culture within the organisation.

Desirable Skills and Experience
  • Certifications in Azure and Other Relevant Technologies: Certifications in cloud and on-premises technologies are beneficial and strengthen your application.
  • Experience in Data Engineering: A minimum of 5 years of data engineering experience, with exposure to managing infrastructure in both on-premises and cloud settings.
  • DevOps Engineering Experience: Prior DevOps experience is advantageous.