Data Engineer

83zero Limited

London

On-site

GBP 45,000 - 70,000

Full time

2 days ago

Job summary

A leading media and broadcasting organisation is seeking a skilled Data Engineer to join its Data & Analytics team on an initial 6-month contract in London. The role suits candidates with strong ETL expertise and experience with Google Cloud Platform, who will design scalable data pipelines and automate infrastructure.

Qualifications

  • Proven experience in data engineering with a strong focus on cloud-based ETL workflows.
  • Solid background with Google Cloud Platform (GCP) and associated data tools.
  • Skilled in Infrastructure as Code (Terraform and Ansible preferred).
  • Confident working with CI/CD pipelines (Jenkins, GitLab CI, GoCD, etc.).
  • Proficient in Python, Go, and shell scripting (Bash).
  • Ability to work 2 days per week onsite in Osterley.

Responsibilities

  • Designing and developing scalable ETL pipelines to process and deliver large volumes of data.
  • Working hands-on with GCP services including BigQuery, Pub/Sub, and Dataflow.
  • Automating infrastructure using Terraform, Ansible, and CI/CD tooling.
  • Writing clean, efficient code in Python, Go, and Bash.
  • Supporting and maintaining a secure Linux-based data engineering environment.
  • Collaborating with stakeholders to ensure data pipelines meet business needs and SLAs.
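To illustrate the kind of ETL work described above, here is a minimal, purely hypothetical Python sketch. It uses in-memory lists as stand-ins for the real sources and sinks; an actual pipeline in this role would run on GCP services such as Pub/Sub, Dataflow, and BigQuery rather than plain Python collections.

```python
# Hypothetical extract/transform/load sketch -- not the employer's
# actual pipeline. In-memory lists stand in for GCP sources and sinks.

def extract(records):
    """Extract: yield raw records from a source (stubbed as a list)."""
    yield from records

def transform(record):
    """Transform: normalise values and drop invalid rows."""
    if record.get("value") is None:
        return None  # reject rows with missing data
    return {"id": record["id"], "value": float(record["value"])}

def load(rows, sink):
    """Load: append cleaned rows to a destination (stubbed as a list)."""
    sink.extend(rows)

raw = [{"id": 1, "value": "3.5"}, {"id": 2, "value": None}]
sink = []
load([r for r in (transform(x) for x in extract(raw)) if r], sink)
# sink now holds only the valid, normalised record
```

The same extract → transform → load shape carries over to a managed pipeline: the source becomes a Pub/Sub subscription, the transform a Dataflow step, and the sink a BigQuery table.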

Skills

ETL expertise
Google Cloud Platform (GCP)
Python
Go
Bash
Terraform
Ansible
CI/CD tools

Job description

83zero is partnered with a leading media and broadcasting organisation looking for a skilled Data Engineer to join its Data & Analytics team on an initial 6-month contract.

This role is perfect for someone with strong ETL expertise, deep experience in Google Cloud Platform (GCP), and a passion for building scalable, cloud-native data pipelines. You'll work with cutting-edge tech in a fast-paced environment, helping to deliver critical insights and analytics to the business.
