Data Engineer

National Audit Office

Greater London

Hybrid

GBP 55,000 - 65,000

Full time

Job summary

A prominent public sector audit organization is seeking a Data Engineer for its Digital Services team. You will design, build, and maintain data infrastructure, ensuring data accessibility and quality. Responsibilities include developing scalable data pipelines and collaborating with diverse teams to support analytics. Candidates should have experience in ETL processes, data integration, and database management. This position is based in London or Newcastle with flexible working options.

Benefits

Civil Service Employer Pension Contribution of 28.9%
Flexible working arrangements

Qualifications

  • Demonstrated experience in ETL and Data Pipeline Development.
  • Proven ability to implement data flows between operational systems and analytics platforms.
  • Experience managing relational and non-relational databases.

Responsibilities

  • Design, develop, and maintain scalable data pipelines and ETL processes.
  • Integrate structured and unstructured data from internal and external sources.
  • Collaborate across teams with analytics engineers, data scientists, and stakeholders.

Skills

Communicating between the technical and non-technical
Data Analysis and Synthesis
Data Development Process
Data Innovation
Data Integration Design
Data Modelling
Metadata Management
Problem Management
Programming and Build (Data Engineering)
Technical Understanding
Testing

Job description

Role: Data Engineer

Contract: Permanent

Location: London or Newcastle

Salary: c.£65,000, plus Civil Service Employer Pension Contribution of 28.9%

Nationality Requirements:

  • UK nationals
  • Nationals of Commonwealth countries who have the right to work in the UK
  • Nationals from the EU, EEA or Switzerland with (or eligible for) status under the European Union Settlement Scheme (EUSS)

Please note, we are not able to sponsor work visas. Please contact us should you have any questions about your nationality eligibility.

The closing date for applications is 11.59pm on 18 January. First-stage interviews will take place over MS Teams during the week commencing 25 January. Second-stage interviews will take place at our offices in Victoria on 2/3 February.

About the National Audit Office

The National Audit Office (NAO) is the UK’s main public sector audit body. Independent of government, we have responsibility for auditing the accounts of various public sector bodies, examining the propriety of government spending, assessing risks to financial control and accountability, and reviewing the economy, efficiency and effectiveness of programmes, projects, and activities. We report directly to Parliament through the Committee of Public Accounts of the House of Commons, which uses our reports as the basis of its investigations. We employ approximately 1,000 people, most of whom are qualified accountants, trainees, or technicians. The organisation comprises two service lines: financial audit, and value for money (VFM) audit, and has a strong core of highly talented corporate teams.

The NAO welcomes applications from everyone. We value diversity in all its forms and the difference it makes to our organisation. By removing barriers and creating an inclusive culture, all our people can develop and maximise their full potential. As members of the Business Disability Forum and the Disability Confident Scheme, we guarantee to interview all disabled applicants who meet the minimum criteria.

The NAO supports flexible working and is happy to discuss this with you at application stage.

Context and main purpose of the job
Introduction

This is a new vacancy within the NAO’s Digital Services (DS), created to expand the data services team. The role is responsible for designing, building, and maintaining the infrastructure that enables robust data collection, storage, and access across the organisation. It supports the development and continual improvement of the NAO’s data and technology services, enabling scalable and reliable data solutions.

In this capacity, you will build and optimize data pipelines, integrate diverse data sources, and ensure the efficient movement of data across systems. You will work closely with analytics engineers, data scientists, and other stakeholders to ensure data is accessible, high-quality, and fit for purpose. Your work will underpin the NAO’s ability to derive insights and automate processes using corporate and client data.

In this role, you will
  • Design, develop, and maintain scalable data pipelines and ETL processes.
  • Integrate structured and unstructured data from internal and external sources.
  • Ensure data quality, consistency, and security across systems.
  • Collaborate with analytics engineers and subject matter experts to support data modelling and transformation.
  • Monitor and optimize performance of data infrastructure.
  • Document data architecture and engineering processes to ensure transparency and maintainability.

This role reports to the Head of Data Services.

This role requires regular attendance at the NAO’s offices in either Victoria, London, or Newcastle.

Responsibilities of the role

As a data engineer at the NAO, you will play a critical role in building and maintaining the technical foundation that enables data-driven operations and insights. You will be responsible for architecting and managing data infrastructure, ensuring that data flows securely and efficiently across systems, and enabling downstream users to access reliable, well-structured data.

Your key responsibilities will include
  • Building scalable data infrastructure: Design and implement systems that support the ingestion, storage, and processing of large volumes of structured and unstructured data from internal and external sources.
  • Developing robust data pipelines: Create automated workflows that extract, transform, and load data into centralized platforms, ensuring consistency, reliability, and performance across all stages.
  • Designing and optimizing ETL processes: Build and maintain efficient ETL (Extract, Transform, Load) workflows to move data from source systems into usable formats. Ensure these processes are scalable, well-documented, and aligned with data quality standards.
  • Integrating diverse data sources: Connect and harmonize data from various systems (e.g., operational databases, APIs, cloud services) to create unified datasets for analysis and reporting.
  • Collaborating across teams: Work closely with analytics engineers, data scientists, and business stakeholders to understand data needs and deliver infrastructure that supports analytical and operational use cases.
  • Ensuring data reliability and performance: Monitor data systems for latency, failures, and bottlenecks. Implement performance tuning and system optimizations to maintain high availability and responsiveness.
  • Implementing data governance and security protocols: Apply best practices for data privacy, access control, and compliance. Ensure that sensitive data is protected and handled in accordance with regulatory requirements.
  • Maintaining technical documentation: Produce and update documentation for data architecture, pipeline configurations, and operational procedures to support transparency and continuity.
  • Troubleshooting and incident response: Investigate and resolve data-related issues, from pipeline failures to data integrity concerns. Establish proactive monitoring and alerting systems.
  • Supporting data accessibility: Enable self-service access to clean, well-organised data for analysts and other users through tools, APIs, or data platforms.
  • Keeping pace with technology: Stay informed about emerging tools, frameworks, and methodologies in data engineering. Continuously evaluate and adopt innovations that improve efficiency and scalability.
Key skills / competencies required

Each skill listed below includes the corresponding skill level (awareness, working, practitioner, expert):

  • Communicating between the technical and non-technical (Skill level: Awareness) You can explain why it's important to communicate technical concepts in non-technical language. You understand the types of communication used with internal and external stakeholders and their impact.
  • Data Analysis and Synthesis (Skill level: Working) You can undertake data profiling and source system analysis. You present clear insights to colleagues to support the end use of the data.
  • Data Development Process (Skill level: Working) You can design, build, and test data products based on feeds from multiple systems, using a range of storage technologies and access methods. You create repeatable and reusable products.
  • Data Innovation (Skill level: Awareness) You show awareness of opportunities for innovation with new tools and uses of data.
  • Data Integration Design (Skill level: Working) You deliver data solutions in accordance with agreed organisational standards that ensure services are resilient, scalable, and future-proof.
  • Data Modelling (Skill level: Working) You understand the concepts and principles of data modelling. You can produce, maintain, and update relevant data models and reverse-engineer models from live systems.
  • Metadata Management (Skill level: Working) You use metadata repositories to complete complex tasks such as data and systems integration impact analysis. You maintain metadata repositories to ensure accuracy and currency.
  • Problem Management (Skill level: Awareness) You investigate problems in systems, processes, and services, and contribute to the implementation of remedies and preventative measures.
  • Programming and Build (Data Engineering) (Skill level: Working) You can design, code, test, correct, and document simple programs or scripts under direction. You follow agreed standards and tools.
  • Technical Understanding (Skill level: Working) You understand core technical concepts related to the role and apply them with guidance.
  • Testing (Skill level: Working) You review requirements and specifications, define test conditions, identify issues and risks, and report test activities and results.
Experience Requirements
  • ETL and Data Pipeline Development: Demonstrated experience in designing, building, and maintaining ETL workflows and data pipelines. Skilled in extracting, transforming, and loading data from various sources into centralized platforms.
  • Data Infrastructure and Integration: Proven ability to implement data flows between operational systems and analytics platforms. Experience with cloud-based data services (e.g., AWS, Azure, GCP) and streaming systems is desirable.
  • Database Management and Optimization: Experience managing relational and non-relational databases, including performance tuning, indexing, and query optimization. Familiarity with database design principles and data warehousing solutions.
  • Collaboration and Communication: Ability to work effectively with technical and non-technical stakeholders. Skilled in translating business requirements into technical solutions and supporting cross-functional teams.
  • Problem Solving and Troubleshooting: Capable of identifying and resolving data-related issues, implementing preventative measures, and contributing to system reliability.
How to Apply

Please upload a CV and a covering letter outlining your suitability and interest in the role before the deadline.
