Data Engineer

JAM Recruitment Ltd

London

Remote

GBP 100,000 - 125,000

Full time

Job summary

A recruitment agency is seeking an experienced Data Engineer to work on a high-profile programme, primarily from home. The successful candidate will design ETL workflows, ensure data quality, and develop data platforms using AWS tools. Applicants must hold a valid Enhanced SC Security Clearance. Competitive pay of up to £745 per day is offered.

Qualifications

  • Proficient in Python and SQL for data processing.
  • Solid experience with Apache Airflow – writing and configuring DAGs.
  • Strong AWS skills (S3, Redshift, etc.).
  • Big data experience with Apache Spark.

Responsibilities

  • Build and refine ETL/ELT workflows using Apache Airflow.
  • Create reliable ingestion processes from APIs and internal systems.
  • Develop and maintain data lakes and warehouses.
  • Implement automated validation, testing, and monitoring for data integrity.

Skills

Python
SQL
Apache Airflow
AWS
Apache Spark
Docker
Kubernetes
Kafka
Data Modelling

Job description

eSC Cleared Data Engineer

Bristol or London (Mostly working from home)

Up to £745 per day (Umbrella, Inside IR35)

Must hold live and transferable Enhanced SC Security Clearance

About the Role

We're seeking an experienced Data Engineer to join a high-profile programme, working with cutting-edge cloud and data technologies. You'll be instrumental in building, optimising, and maintaining scalable data pipelines and platforms that underpin mission-critical systems.

If you thrive on solving complex data challenges, working with modern orchestration tools, and applying best practices in security and compliance, this role offers both technical depth and impact.

Key Responsibilities

  1. Design & Optimise Pipelines - Build and refine ETL/ELT workflows using Apache Airflow for orchestration.
  2. Data Ingestion - Create reliable ingestion processes from APIs and internal systems, leveraging tools such as Kafka, Spark, or AWS-native services.
  3. Cloud Data Platforms - Develop and maintain data lakes and warehouses (e.g., AWS S3, Redshift).
  4. Data Quality & Governance - Implement automated validation, testing, and monitoring for data integrity.
  5. Performance & Troubleshooting - Monitor workflows, enhance logging/alerting, and fine-tune performance.
  6. Data Modelling - Handle schema evolution, partitioning strategies, and efficient data structures.
  7. CI/CD for Data - Work with DevOps teams to integrate pipelines into robust CI/CD processes, managing version control for DAGs and configurations.
  8. Security & Compliance - Apply encryption, access control (IAM), and GDPR-aligned data practices.

Technical Skills & Experience

  • Proficient in Python and SQL for data processing.
  • Solid experience with Apache Airflow - writing and configuring DAGs.
  • Strong AWS skills (S3, Redshift, etc.).
  • Big data experience with Apache Spark.
  • Knowledge of data modelling, schema design, and partitioning.
  • Understanding of batch and streaming data architectures (e.g., Kafka).
  • Experience with Docker and/or Kubernetes.
  • Familiarity with the Microsoft SQL Server stack (SQL Server, SSIS, C#, T-SQL) and Elastic/OpenSearch.

Interested?

If you're ready to bring your data engineering expertise to a technically challenging and rewarding programme, apply now with your CV.

Job Info

Job Title: Data Engineer

Company: JAM Recruitment Ltd

Location: Bristol or London

Posted: Aug 14th 2025

Closes: Sep 14th 2025

Sector: Military, Emergency & Government

Contract: Contract

Hours: Full Time
