Senior Data Engineer

Public Sector Resourcing, managed by AMS

London

Hybrid

GBP 50,000 - 70,000

Full time

Today

Job summary

A government department in the UK is seeking a Senior Data Engineer for a hybrid contract based in London. The role focuses on building and improving data pipelines using modern tools. Candidates must have significant experience with Python, SQL, and cloud services. Active security clearance is essential for eligibility. This position aims to drive better data-driven decision-making across the organization.

Qualifications

  • Strong experience designing, building and maintaining data pipelines using modern tools.
  • Proficient in Python and SQL for data engineering tasks.
  • Experience with dbt and understanding data modeling approaches.

Responsibilities

  • Review and improve existing data pipelines.
  • Work closely with performance and data teams to improve data usage.
  • Advocate for data-driven decision-making across the group.

Skills

  • Data pipeline design and maintenance
  • Proficiency in Python
  • Proficiency in SQL
  • Experience with dbt
  • Understanding of data modelling approaches
  • Familiarity with Airflow
  • Experience with AWS services
  • Version control and CI/CD tools
  • Stakeholder engagement
  • Adaptability and curiosity

Job description

On behalf of the Ministry of Justice, we are looking for a Senior Data Engineer (Inside IR35) for a 4-month hybrid contract, based 2 days a week in London or any UK office.

Note: SC Clearance is an essential requirement for this role; as a minimum, you must be willing and eligible to undergo checks. Please note that, due to the exceptional requirements of this position (and the speed at which we require a postholder in situ), preference may be given to candidates who meet all of the essential criteria and hold active security clearance.

You'll join the STG Performance Team, which is tackling a fundamental challenge across our Directorate General. We're addressing this by developing two things: a modern, user-centred framework for measuring performance and organisational health, covering everything from delivery and finance to people, risk and user satisfaction, and a dashboard for senior leaders to surface these metrics and use them to inform collective decision-making.

We've already built an MVP dashboard in Amazon QuickSight, now in active use. We're now rebuilding it using a flexible, code-first approach and using this team as a testbed for smarter digital delivery, applying product operating model principles, working in lean and cross-functional ways, and embedding AI-native workflows using tools like ChatGPT, GitHub Copilot and agentic assistants.

A key focus for this role will be strengthening our data foundations. That means reviewing and improving our existing pipelines, some of which are still manual, and replacing them with more modern, automated and consistent approaches using our Analytical Platform. We want to reduce the burden on teams to collect and share data, retrieve more from source, and drive better conversations about data quality, completeness and fitness for purpose. We see dashboards not just as reporting tools, but as a way to spark a virtuous cycle of improvement, showing the current state of data (even when it's rough), surfacing what matters most, and helping leaders take informed action.

You'll play a central role in that cycle. You'll work closely with colleagues across performance, data, digital and portfolio teams to improve how we collect, structure and use data, and you'll be an advocate for data that drives better decisions and more joined-up leadership across the group.

Skills and Experience
* Strong experience designing, building and maintaining data pipelines using modern, cloud-based tools and practices
* Proficiency in Python and SQL for data engineering tasks
* Experience with dbt and a good understanding of data modelling approaches (e.g. star schema, dimensional modelling)
* Familiarity with Airflow or similar orchestration tools
* Comfortable working with AWS services such as Glue and S3, or equivalent cloud infrastructure
* Experience using version control and CI/CD tools like Git and GitHub Actions
* Confident working independently and taking ownership of problems and solutions
* Comfortable engaging directly with stakeholders, both technical and non-technical, to understand needs, share ideas and shape solutions
* Willingness to work closely with users, gather feedback, and iterate quickly to improve products and data quality
* Adaptable, curious, and open to experimenting with new tools, techniques and ways of working, including AI coding assistants and lean, cross-functional approaches

Please be aware that this role can only be worked from within the UK, not overseas.
