Senior Data Engineer

Additional Resources Ltd

City of Westminster

Hybrid / Remote

GBP 60,000 - 80,000

Contract

6 days ago

Job summary

A well-established biotech company in the UK is looking for a Senior Data Engineer to develop and optimise scalable data pipelines using Azure cloud technologies. This is a 6-12 month contract with hybrid/remote working options and a competitive daily rate of £500 - £650. Candidates should have strong skills in Python, Kubernetes, and data engineering practices, contributing to innovative data solutions in healthcare.

Qualifications

  • Previous experience in data engineering roles.
  • Highly skilled in Python for automation.
  • Familiarity with lakehouse technologies.

Responsibilities

  • Design and implement cloud-based data architectures.
  • Build scalable data pipelines for high-volume processing.
  • Automate infrastructure with Infrastructure-as-Code tools.

Skills

Azure cloud platforms
Python
Kubernetes
Docker
Terraform
Ansible
PostgreSQL

Tools

Spark
Databricks
Grafana
Prometheus
New Relic

Job description

An opportunity has arisen for a Senior Data Engineer to join a well-established biotech company using large-scale genetic data and AI to predict disease risk and advance precision healthcare.

As a Senior Data Engineer, you will be responsible for developing, automating, and optimising scalable data pipelines using modern cloud technologies.

This is a 6-12 month contract-based role with hybrid/remote working options, offering a daily rate of £500 - £650 (Inside IR35) and benefits.

You Will Be Responsible For:
  • Designing and implementing cloud-based data architectures using Azure services.
  • Building robust and scalable data pipelines to support complex, high-volume processing.
  • Deploying and managing containerised workloads through Kubernetes, Helm, and Docker.
  • Automating infrastructure using Infrastructure-as-Code tools such as Terraform and Ansible.
  • Ensuring system reliability through observability, monitoring, and proactive issue resolution.
  • Collaborating with cross-functional teams to align data solutions with wider business needs.
  • Supporting the continuous improvement of processes, deployment, and data quality standards.
What We Are Looking For:
  • Previously worked as a Senior Data Engineer, Data Engineer, Data Platform Engineer, Data Architect, Data Infrastructure Engineer, Cloud Data Engineer, DataOps Engineer, Data Pipeline Engineer, DevOps Engineer, or in a similar role.
  • Proven experience with Azure cloud platforms and related architecture.
  • Highly skilled in Python for data engineering, scripting, and automation.
  • Strong working knowledge of Kubernetes, Docker, and cloud-native data ecosystems.
  • Demonstrable experience with Infrastructure as Code tools (Terraform, Ansible).
  • Hands‑on experience with PostgreSQL and familiarity with lakehouse technologies (e.g., Apache Parquet, Delta Tables).
  • Exposure to Spark, Databricks, and data lake/lakehouse environments.
  • Understanding of Agile development methods, CI/CD pipelines, GitHub, and automated testing.
  • Practical experience monitoring live services using tools such as Grafana, Prometheus, or New Relic.

This is an excellent opportunity to play a key role in shaping innovative data solutions within a forward‑thinking organisation.
