AWS Data Engineer

TXP

Remote

GBP 60,000 - 80,000

Full time

Job summary

A public sector technology firm is recruiting an AWS-focused Data Engineer to contribute to a critical project. This role requires a candidate with strong back-end and data engineering experience, particularly with AWS or Azure ETL services. The successful candidate must hold valid SC Clearance. The contract pays between £450 and £500 per day (Inside IR35), offers remote working with occasional site visits, and is expected to run until the end of March 2026, with potential for extension.

Qualifications

  • Experience in back-end / data engineering across multiple languages.
  • Proficiency in AWS ETL or Azure ETL services.
  • Familiarity with querying data on AWS S3 or Azure ADLSv2.

Responsibilities

  • Work on a project within the Public Sector.
  • Develop and maintain data processing jobs.
  • Automate operational tasks using scripting languages.

Skills

  • Back-end / data engineering experience
  • Python
  • AWS ETL or Azure ETL services
  • AWS S3 or Azure ADLSv2
  • API-level and database connectivity
  • GitLab
  • Terraform or similar cloud products
  • Data feature development
  • Automation with scripting languages

Job description

Role: Data Engineer (AWS)
Contract: £450 - £500 per day (Inside IR35)
Location: Remote working with occasional travel to site
Duration: End of March 2026 (expected to extend into the new financial year)

We are currently recruiting for a Data Engineer to work on a project within the Public Sector. The role is AWS-focused and requires a Data Engineer who can make an immediate impact on the project.

Skills and experience required
  • Experience in back-end / data engineering across a number of languages (including Python) and commonly used IDEs
  • Experience with developing, scheduling, maintaining and resolving issues with batch or micro-batch jobs on AWS ETL or Azure ETL services
  • Experience querying data stored on AWS S3 or Azure ADLSv2, or through a Lakehouse capability
  • Experience in managing API-level and Database connectivity
  • Experience using source control and DevOps tooling such as GitLab
  • Experience using Terraform (or similar cloud-native products) to build new data & analytics platform capabilities
  • Experience developing data features and associated transformation procedures on a modern data platform, including (but not limited to) Microsoft Fabric, AWS Lake Formation, Databricks or Snowflake
  • Experience automating operational tasks with one or more scripting languages

Due to the nature of the project and the short turnaround required, the successful candidate must hold valid and live SC Clearance.

If you are interested in the role and would like to apply, please click on the link for immediate consideration.
