Data Engineer – GCP / DSS

DCV Technologies

Greater London

Hybrid

GBP 60,000 - 80,000

Full time

Today

Job summary

A leading technology services provider is seeking a skilled Data Engineer for a hybrid role in London. You will design and maintain data pipelines using BigQuery and work closely with data scientists and engineers. Strong knowledge of GCP, Python, and SQL is essential. This position offers a competitive salary and opportunities for both contract and permanent placements.

Qualifications

  • Experience designing data models and developing industrialised data pipelines.
  • Strong knowledge of database and data lake systems.
  • Hands-on experience with BigQuery, dbt, and GCP Cloud Storage.

Responsibilities

  • Design, build, optimise and maintain production-grade data pipelines.
  • Work with teams to utilise new internal and external data sources.
  • Create frameworks and systems to manage data assets.

Skills

Data model design
Data pipeline development
BigQuery
Python
SQL
Terraform
Tableau Cloud
Agile methodologies

Tools

GCP
dbt
Cloud SQL
Airbyte
Dagster

Job description

Job Title: Data Engineer – GCP / DSS

Department: Enabling Functions

Location: Hybrid, London

Type: Both Contract (Inside IR35) & Permanent available

Salary: Competitive; depends on experience and open to discussion

Purpose of Job

What you will be working on

While our broker platform is the core technology crucial to our success, this role will focus on supporting the middle and back-office operations that will lay the foundations for further, sustained success.

We're a multi-disciplined team, bringing together expertise in software and data engineering, full-stack development, platform operations, algorithm research, and data science. Our squads focus on delivering high-impact solutions; we favour a highly iterative, analytical approach.

You will be designing and developing complex data processing modules and reporting using BigQuery and Tableau. You will also work closely with the Infrastructure / Platform Team, which is responsible for architecting and operating the core of the Data Analytics platform.

Principal Accountabilities

Work with business teams (initially finance and actuarial), data scientists, and engineers to design, build, optimise, and maintain production-grade data pipelines and reporting from an internal data warehouse solution based on GCP / BigQuery.

Work with finance, actuaries, data scientists and engineers to understand how we can make best use of new internal and external data sources.

Work with our delivery partners at EY / IBM to ensure robust design and engineering of the data model, MI, and reporting that can support our ambitions for growth and scale.

Take BAU ownership of data models, reporting, and integrations / pipelines.

Create frameworks, infrastructure and systems to manage and govern data assets.

Produce detailed documentation to allow ongoing BAU support and maintenance of data structures, schemas, reporting, etc.

Work with the broader Engineering community to develop our data and MLOps capability infrastructure.

Ensure data quality, governance, and compliance with internal and external standards.

Monitor and troubleshoot data pipeline issues, ensuring reliability and accuracy.

Regulatory Conduct and Rules

  1. Act with integrity
  2. Act with due skill, care and diligence
  3. Be open and co‑operative with Lloyd’s, the FCA, the PRA, and other regulators
  4. Pay due regard to the interests of customers and treat them fairly
  5. Observe proper standards of market conduct

Education, Qualifications, Knowledge, Skills and Experience

  • Experience designing data models and developing industrialised data pipelines.
  • Strong knowledge of database and data lake systems.
  • Hands-on experience with BigQuery, dbt, and GCP Cloud Storage.
  • Proficient in Python, SQL and Terraform.
  • Knowledge of Cloud SQL, Airbyte, Dagster.
  • Comfortable with shell scripting with Bash or similar.
  • Experience provisioning new infrastructure in a leading cloud provider, preferably GCP.
  • Proficient with Tableau Cloud for data visualisation and reporting.
  • Experience creating DataOps pipelines.
  • Comfortable working in an Agile environment, actively participating in approaches such as Scrum or Kanban.

Desirable Skills

  1. Experience with streaming data systems and frameworks would be a plus.
  2. Experience working in a regulated industry, especially financial services, would be a plus.
  3. Experience creating MLOps pipelines would be a plus.

The applicant must also demonstrate the following skills and abilities:

Excellent communication skills (both oral and written).

Pro-active, self-motivated, and able to use their own initiative.

Excellent analytical and technical skills.

Ability to quickly comprehend the functions and capabilities of new technologies.

Ability to offer a balanced opinion regarding existing and future technologies.

How to Apply

If you are interested in the Data Engineer – GCP / DSS position, please apply here.
