Junior / Mid Level Data Engineer - Inside IR35 - SC Cleared

SR2

United Kingdom

Hybrid

GBP 60,000 - 80,000

Full time

Job summary

A leading technology firm seeks a Junior/Mid Level Data Engineer to support a government data transformation initiative. This role involves designing and optimizing secure data workflows while working with AWS services like S3 and Glue. Ideal candidates will have strong Python and SQL skills and thrive in a hybrid working environment. The position offers a competitive daily rate and an opportunity to contribute to a significant, long-term project.

Qualifications

  • Strong Data Engineering experience within AWS environments.
  • Hands-on experience with core AWS data services like S3, Glue, and Lambda.
  • Proficiency in Python and SQL for data transformations.

Responsibilities

  • Design, develop and maintain scalable cloud-native data pipelines.
  • Implement ETL/ELT processes for managing data securely.
  • Communicate progress, risks, and trade-offs to stakeholders.

Skills

Data Engineering experience
AWS environment expertise
Proficiency in Python
Proficiency in SQL
Communication skills

Tools

AWS S3
AWS Glue
AWS Lambda
AWS Athena
Terraform
GitLab

Job description
Junior / Mid Level Data Engineer - SC Cleared

Inside IR35: £450 - £500 per day

Hybrid: Once a week in London

Start date: 5th Jan

We are supporting a major government data transformation initiative focused on strengthening the use of evidence-based insights across frontline and operational teams. As part of a new capability being built to process and analyse sensitive interview information, the programme requires a SFIA 3 (Junior - Mid Level) Data Engineer to design, deliver, and optimise secure backend data workflows.

This work is foundational: building the ingestion, orchestration, storage, and transformation layers that power the analytics tool.
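
To give a flavour of the transformation-layer work described above, the following is a minimal Python sketch of a single S3-to-S3 cleaning step using boto3. It is purely illustrative and not drawn from the programme itself: the bucket names, prefixes, and field names are hypothetical placeholders.

    """Illustrative only: read a raw CSV object from S3, keep a fixed set of
    fields, and write the cleaned records to a curated prefix. All names below
    are hypothetical placeholders, not details of the actual programme."""

    import csv
    import io

    import boto3

    s3 = boto3.client("s3")

    RAW_BUCKET = "example-raw-zone"          # hypothetical bucket
    CURATED_BUCKET = "example-curated-zone"  # hypothetical bucket
    KEEP_FIELDS = ["interview_id", "recorded_at", "summary"]  # illustrative schema


    def transform_object(key: str) -> str:
        """Read one raw object, project it onto KEEP_FIELDS, and write the result."""
        raw = s3.get_object(Bucket=RAW_BUCKET, Key=key)["Body"].read().decode("utf-8")
        reader = csv.DictReader(io.StringIO(raw))

        out = io.StringIO()
        writer = csv.DictWriter(out, fieldnames=KEEP_FIELDS)
        writer.writeheader()
        for row in reader:
            writer.writerow({field: row.get(field, "") for field in KEEP_FIELDS})

        curated_key = f"curated/{key}"
        s3.put_object(
            Bucket=CURATED_BUCKET,
            Key=curated_key,
            Body=out.getvalue().encode("utf-8"),
        )
        return curated_key

The same shape of function could sit behind an AWS Glue Python shell job or a Lambda handler; in practice the choice depends on data volumes and orchestration needs.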

The programme is just kicking off, making this an ideal time to join, add value, and grow over the course of a long-term engagement.

Key Responsibilities
  • Design, develop and maintain scalable cloud‑native data pipelines
  • Implement ETL/ELT processes to manage structured and unstructured data securely and efficiently
  • Ensure data integrity, traceability and compliance across all pipeline stages
  • Work with cross‑functional teams to define technical requirements and design decisions
  • Apply DevOps best practices, monitoring, and automation to improve reliability
  • Support continuous improvement of the platform's performance and operational maturity
  • Communicate progress, risks and trade‑offs clearly to wider delivery stakeholders
Required Skills & Experience
  • Strong Data Engineering experience within AWS environments
  • Hands‑on experience with core AWS data services: S3, Glue, Lambda, Athena, Kinesis, Step Functions (or similar)
  • Proficiency in Python and SQL for data transformations and automation (a short illustrative sketch follows this list)
  • Experience with IaC and CI/CD tooling (Terraform, GitLab, etc.)
  • Comfortable working with sensitive datasets and secure‑by‑design approaches
  • Strong communication skills and a proactive, consulting mindset
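
As a purely illustrative example of the Python-and-SQL side of the role (referenced in the list above), here is a minimal sketch that submits a SQL transformation to Athena from Python with boto3 and polls for completion. The database, table names, SQL, and results location are hypothetical placeholders, not details of the actual engagement.

    """Illustrative only: drive a SQL transformation through Athena from Python.
    The database, tables, and output location are hypothetical placeholders."""

    import time

    import boto3

    athena = boto3.client("athena")

    # Illustrative CTAS query; schema and table names are placeholders.
    QUERY = """
    CREATE TABLE curated.daily_interview_counts AS
    SELECT date_trunc('day', recorded_at) AS day,
           count(*) AS interviews
    FROM raw.interviews
    GROUP BY 1
    """


    def run_query(query: str) -> str:
        """Submit the query, poll until it reaches a terminal state, return that state."""
        execution = athena.start_query_execution(
            QueryString=query,
            QueryExecutionContext={"Database": "curated"},  # hypothetical database
            ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},  # hypothetical
        )
        query_id = execution["QueryExecutionId"]

        while True:
            status = athena.get_query_execution(QueryExecutionId=query_id)
            state = status["QueryExecution"]["Status"]["State"]
            if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
                return state
            time.sleep(2)

In a production pipeline this kind of call would normally be wrapped in orchestration (for example Step Functions) with retries and alerting rather than a bare polling loop.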