
Lead AWS Data Engineer

Opus Recruitment Solutions

Tyseley

On-site

GBP 60,000 - 80,000

Full time

Today

Job summary

A recruitment agency is seeking a Lead AWS Data Engineer to join a finance client in the UK. The role involves building and optimising reporting workflows, leading data-sourcing initiatives, and upgrading existing pipelines in a collaborative environment. Applicants must have a strong background in data engineering, hands-on AWS experience, and knowledge of Java, Spark, and Snowflake. This is a 12-month contract position (outside IR35) requiring on-site presence in London or Birmingham.

Qualifications

  • Strong background in data engineering with distributed data environments.
  • Hands-on expertise with AWS, Spark, Glue, and Snowflake.
  • Experience building and optimizing data pipelines & reporting workflows.

Responsibilities

  • Build a new reporting workflow and upgrade existing pipelines.
  • Lead a full data-sourcing uplift across multiple workflows.
  • Work closely with internal engineering and controls teams.

Skills

Data engineering with distributed data environments
Hands-on expertise with AWS
Experience with Spark
Proficiency in Glue
Knowledge of Snowflake
Building and optimizing data pipelines
Experience leading engineering teams

Tools

AWS Glue
Apache Spark
Snowflake
Java

Job description

Lead AWS Data Engineer | Birmingham | Finance | AWS | Java | Outside IR35 | Contract | 12 Months

Opus is partnered with a financial services client to deliver a major programme of work.

You’ll join an established engineering group, working alongside internal teams to build a new reporting workflow, upgrade an existing pipeline, and lead a full data‑sourcing uplift across multiple reporting workflows. The team will also help upgrade the framework for two critical internal workflows. This role requires you to be on site at either the London or Birmingham office 5 days per week; please apply only if you meet this requirement.

Required Experience
  • Strong background in data engineering with distributed data environments
  • Hands‑on expertise with AWS, Spark, Glue, and Snowflake
  • Experience building and optimising data pipelines & reporting workflows
  • Ability to work closely with internal engineering and controls teams
  • Experience upgrading or modernising existing workflows and frameworks
  • For Lead level: prior experience leading engineering teams or workstreams
Tech Stack
  • AWS (Glue, S3, Lambda, Step Functions)
  • Apache Spark
  • Snowflake
  • Java

If you are interested in this role, please apply here or email me your most recent CV, along with your availability, to (url removed).
