Job Search and Career Advice Platform


Senior Snowflake Data Engineer - Remote - £competitive

Jefferson Frank

Greater London

Remote

GBP 75,000 - 100,000

Full time

Today

Job summary

A data solutions company is seeking a Senior Snowflake Data Engineer to join their team. The role offers the opportunity to design and optimize high-performance data pipelines, involving technologies like Snowflake and dbt. Candidates should have over 7 years in data engineering, with strong SQL and Python skills. The successful applicant will work in a collaborative environment focused on engineering best practices, alongside an attractive salary and professional development support.

Skills

Data engineering experience
Snowflake
dbt
Advanced SQL
Python programming
Git and CI/CD
ETL/ELT tools
Cloud platforms (AWS, Azure)

Tools

Snowflake
dbt
Airflow
Tableau
Power BI

Job description

About the Role

We are looking for an experienced Senior Snowflake Data Engineer to join a dynamic team working on cutting-edge data solutions. This is an exciting opportunity to design, build, and optimise high-performance data pipelines using Snowflake, dbt, and modern engineering practices. If you are passionate about data engineering, test‑driven development, and cloud technologies, we'd love to hear from you.

Key Responsibilities
  • Design, develop, and optimise scalable data pipelines in Snowflake.
  • Build and maintain dbt models with robust testing and documentation.
  • Apply test‑driven development principles for data quality and schema validation.
  • Optimise pipelines to reduce processing time and compute costs.
  • Develop modular, reusable transformations using SQL and Python.
  • Implement CI/CD pipelines and manage deployments via Git.
  • Automate workflows using orchestration tools such as Airflow or dbt Cloud.
  • Configure and optimise Snowflake warehouses for performance and cost efficiency.
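As a rough illustration of the test-driven data-quality and schema-validation work described above, the sketch below shows the kind of check such pipelines rely on. The table shape, column names, and rules are hypothetical examples only; in practice a team like this would express them as dbt schema tests (`not_null`, `unique`) rather than hand-rolled Python.

```python
# Minimal sketch of data-quality checks in the spirit of dbt schema tests.
# All names and rules here are illustrative, not part of any real pipeline.

def validate_rows(rows, required_columns, not_null=(), unique_key=None):
    """Return a list of human-readable data-quality failures (empty list = pass)."""
    failures = []
    for i, row in enumerate(rows):
        # Schema validation: every required column must be present.
        missing = [c for c in required_columns if c not in row]
        if missing:
            failures.append(f"row {i}: missing columns {missing}")
        # Not-null checks on selected columns.
        for c in not_null:
            if row.get(c) is None:
                failures.append(f"row {i}: null value in {c!r}")
    # Uniqueness check on an optional key column.
    if unique_key:
        seen = set()
        for i, row in enumerate(rows):
            key = row.get(unique_key)
            if key in seen:
                failures.append(f"row {i}: duplicate {unique_key}={key!r}")
            seen.add(key)
    return failures


if __name__ == "__main__":
    orders = [
        {"order_id": 1, "amount": 25.0},
        {"order_id": 1, "amount": None},  # duplicate key and a null amount
    ]
    for problem in validate_rows(
        orders,
        required_columns=["order_id", "amount"],
        not_null=["amount"],
        unique_key="order_id",
    ):
        print(problem)
```

In a dbt project the same intent would live declaratively in a model's YAML (`tests: [not_null, unique]`), with CI running `dbt test` on every merge.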

Required Skills & Experience
  • 7+ years in data engineering roles.
  • 3+ years hands‑on experience with Snowflake.
  • 2+ years production experience with dbt (mandatory).
  • Advanced SQL and strong Python programming skills.
  • Experience with Git, CI/CD, and DevOps practices.
  • Familiarity with ETL/ELT tools and cloud platforms (AWS, Azure).
  • Knowledge of Snowflake features such as Snowpipe, streams, tasks, and query optimisation.

Preferred Qualifications
  • Snowflake certifications (SnowPro Core or Advanced).
  • Experience with dbt Cloud and custom macros.
  • Exposure to real-time streaming (Kafka, Kinesis).
  • Familiarity with data observability tools and BI integrations (Tableau, Power BI).

What We Offer
  • Opportunity to work with modern data technologies and large-scale architectures.
  • Professional development and certification support.
  • Collaborative, engineering‑focused culture.
  • Competitive salary and benefits package.

Interested?

Apply now with your CV highlighting your Snowflake, dbt, and DevOps experience.
