Senior Data Platform Engineer

New Day

London

On-site

GBP 50,000 - 80,000

Full time

16 days ago

Job summary

A leading company in London is seeking a skilled Senior Data Platform Engineer to develop its Snowflake ecosystem, automate processes, and support analytics integration. The ideal candidate will have expertise in Python, SQL, and AWS infrastructure, along with a grounding in DevOps principles. This permanent position offers an exciting opportunity to work with modern data technologies and contribute to a collaborative team environment.

Qualifications

  • Proficient knowledge of Snowflake ecosystems and Airflow.
  • Strong coding capabilities in Python and SQL.
  • Familiarity with AWS infrastructure and DevOps practices.

Responsibilities

  • Design and manage a Snowflake data ecosystem that gives end-user teams self-service data access.
  • Automate processes and manage user permissions in Snowflake.
  • Support analytics integration with tools like PowerBI and Tableau.

Skills

Snowflake
Python
SQL
Airflow
Docker
DevOps tools
Terraform
Distributed systems
Cloud computing

Job description

What You'll Deliver in the First 0-12 Months
  • Design and Build: Create and manage a Snowflake data ecosystem to support event and batch workflows, enabling end-user teams to access data, perform transformations, build reports, and leverage automation tools.
  • Process Automation: Ensure processes are automated, observable, and compliant with platform standards.
  • Schema Management: Enhance and maintain a Sqitch project for Snowflake schema management.
  • Library Maintenance: Maintain and enhance a Python library for end-user data access in Snowflake and S3 (a sketch of such a helper follows this list).
  • Administration: Manage Snowflake and associated tooling accounts, including user and permission management (IAM, AD).
  • Airflow Support: Collaborate with users to write and support Airflow DAGs hosted in Astronomer Cloud (an example DAG also follows this list).
  • Streaming Data: Work within a streaming data ecosystem, such as Kafka.
  • Analytics Integration: Integrate visualisation and analytics tools like PowerBI and Tableau.
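
As a hedged illustration of the library work above, here is a minimal sketch of the kind of helper such a Python library might expose, assuming snowflake-connector-python and boto3 are installed; the account, user, and function names are illustrative, not the team's actual API.

    import boto3
    import snowflake.connector

    def fetch_rows(sql: str):
        """Run a query in Snowflake and return all rows."""
        conn = snowflake.connector.connect(
            account="my_account",             # hypothetical account
            user="analyst",                   # hypothetical user
            authenticator="externalbrowser",  # SSO sign-in, one common option
        )
        try:
            with conn.cursor() as cur:
                cur.execute(sql)
                return cur.fetchall()
        finally:
            conn.close()

    def download_extract(bucket: str, key: str, dest: str) -> None:
        """Pull a file (for example a batch extract) from S3 to a local path."""
        boto3.client("s3").download_file(bucket, key, dest)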
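For the Airflow side, a minimal sketch of a daily DAG of the sort end-user teams might ask for help with, assuming Airflow 2.4 or later; the DAG and task names are hypothetical.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def load_batch(ds=None, **_):
        # Placeholder for a batch load step; Airflow passes the run date as ds.
        print(f"loading batch for {ds}")

    with DAG(
        dag_id="daily_batch_load",     # hypothetical DAG name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",             # keyword available from Airflow 2.4
        catchup=False,
    ):
        PythonOperator(task_id="load_batch", python_callable=load_batch)
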
Skills and Experience

Essential Skills
  • Proficient knowledge of Snowflake (columnar databases), Jupyter notebooks, Airflow, and Docker.
  • Strong Python and SQL coding abilities.
  • Familiarity with distributed systems, APIs, and cloud computing patterns.
  • Understanding of virtualisation and containerisation.
  • Experience with EKS/Kubernetes and DevOps tools, particularly GitHub Actions.
  • Understanding of AWS infrastructure and services (S3, EKS, IAM, Storage Gateway, Athena).
  • Proficient in Terraform or a solid understanding of Infrastructure as Code (IaC).
Desirable Skills
  • Expertise in Snowflake concepts such as resource monitors, RBAC controls, virtual warehouse setup, query performance tuning, and time travel (illustrated in the sketch after this list).
  • Experience with Kafka and dbt.
  • Understanding of Data Vault design patterns and cloud/network configurations (Firewalls, Proxies).
  • Familiarity with AWS networking (VPC, VPN, Direct Connect, Route 53, Auto Scaling, CloudWatch, GuardDuty).
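
To make those Snowflake concepts concrete, here is a hedged sketch of automating a resource monitor, an RBAC grant, and a time travel query from Python; the SQL follows standard Snowflake syntax, and every object name (rm_analytics, analytics_wh, prod.core.orders, analyst_role) is hypothetical.

    import snowflake.connector

    STATEMENTS = [
        # Cap monthly credit spend and suspend the warehouse at the limit.
        "CREATE OR REPLACE RESOURCE MONITOR rm_analytics "
        "WITH CREDIT_QUOTA = 100 TRIGGERS ON 100 PERCENT DO SUSPEND",
        "ALTER WAREHOUSE analytics_wh SET RESOURCE_MONITOR = rm_analytics",
        # RBAC: give an analyst role read access to one schema.
        "GRANT USAGE ON DATABASE prod TO ROLE analyst_role",
        "GRANT USAGE ON SCHEMA prod.core TO ROLE analyst_role",
        "GRANT SELECT ON ALL TABLES IN SCHEMA prod.core TO ROLE analyst_role",
    ]

    conn = snowflake.connector.connect(
        account="my_account",             # hypothetical
        user="platform_admin",            # hypothetical
        authenticator="externalbrowser",
    )
    try:
        with conn.cursor() as cur:
            for stmt in STATEMENTS:
                cur.execute(stmt)
            # Time travel: count rows as the table stood one hour ago.
            cur.execute("SELECT COUNT(*) FROM prod.core.orders AT(OFFSET => -3600)")
            print(cur.fetchone())
    finally:
        conn.close()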
We work with Textio to make our job design and hiring inclusive.

Permanent