GCP Data Engineer - London - £75k + bonus

Jefferson Frank

City of London

On-site

GBP 75,000 - 87,000

Full time

30+ days ago

Job summary

A leading Lloyd's of London reinsurance broker is on the lookout for a skilled GCP Data Engineer to join their data team. In this role, you will focus on designing data pipelines, leading cloud migrations, and leveraging your strong skills in GCP, SQL, and Python. You will also help train junior staff and work with emerging AI technologies. With a competitive salary and a collaborative environment, this role offers a great opportunity for career progression.

Qualifications

  • Proven experience as a Data Engineer in a commercial environment.
  • Strong hands-on experience with GCP services (e.g., BigQuery, Dataflow).
  • Advanced SQL skills and proficiency in Python.

Responsibilities

  • Design, build, and maintain scalable data pipelines in Google Cloud Platform.
  • Lead migration from Azure to GCP for seamless data integration.
  • Write clean SQL and Python code for data transformation.

Skills

GCP
SQL
Python
Azure

Tools

Terraform

Job description

GCP Data Engineer - London - £75k + bonus

Please note - this role will require you to attend the London-based office 2-3 days per week. To be considered for this role you must have the unrestricted right to work in the UK - this organisation cannot offer sponsorship.

Are you a skilled Data Engineer with a passion for cloud technologies and a strong foundation in GCP, Azure, SQL, and Python? A leading Lloyd's of London reinsurance broker is seeking a talented individual to join their growing data team and help shape the future of their cloud data platform.

Key Responsibilities:

* Design, build, and maintain scalable data pipelines and ETL processes in Google Cloud Platform (GCP).
* Lead and contribute to a major cloud migration project from Azure to GCP, ensuring seamless data integration and minimal disruption.
* Collaborate with data scientists, analysts, and business stakeholders to deliver high-quality data solutions.
* Integrate data from various sources, ensuring data quality, consistency, and security.
* Write clean, efficient, and well-documented SQL and Python code for data transformation and automation.
* Assist in the adoption and use of emerging AI technologies.
* Assist in training and mentoring more junior members of staff.

Required Skills & Experience:

* Proven experience as a Data Engineer in a commercial environment.
* Strong hands-on experience with Google Cloud Platform (GCP) services (e.g., BigQuery, Dataflow, Pub/Sub).
* Solid understanding of Azure data services and hybrid cloud environments.
* Advanced SQL skills and proficiency in Python for data engineering tasks.
* Experience working in or with insurance, reinsurance, or financial services is a strong advantage.

Desirable:

* Familiarity with data governance, security, and compliance in regulated industries.
* Experience with CI/CD pipelines and infrastructure-as-code tools (e.g., Terraform).
* Knowledge of data modelling and warehousing best practices.

Interviews for this role are beginning this week, so please apply today if you'd like to be considered! To apply, please submit your CV or contact David Airey on 0191 338 7508 or at d.airey@tenthrevolution.com.

Tenth Revolution Group are the go-to recruiter for Data & AI roles in the UK, offering more opportunities across the country than any other recruitment agency. We're the proud sponsor and supporter of SQLBits, Power Platform World Tour, and the London Fabric User Group. We are the global leaders in Data & AI recruitment.
