Snowflake Architect

ZipRecruiter

Basildon

On-site

GBP 70,000 - 90,000

Full time

24 days ago

Job summary

A leading recruiting platform is seeking a Snowflake Architect in Basildon to design and maintain data pipelines using Snowflake on AWS. The ideal candidate will have over 5 years of experience in data engineering, with strong skills in data migration, ETL design, and AWS services. The role requires effective communication and collaboration with cross-functional teams to deliver high-quality data solutions.

Qualifications

  • 5+ years of experience in data engineering, with a strong focus on Snowflake and AWS.
  • Proficiency in SQL, Python, and ETL tools.
  • Experience migrating data to Snowflake.

Responsibilities

  • Design, develop, and maintain robust data pipelines and ETL processes using Snowflake on AWS.
  • Implement data warehousing solutions, ensuring efficient storage, retrieval, and transformation of large datasets.
  • Collaborate with stakeholders to define and fulfill data requirements.

Skills

  • Data Architecture
  • Data Migration
  • Data Modeling
  • Snowflake Designer/Developer
  • DBT (Data Build Tool)
  • ETL Design
  • AWS Services
  • StreamSets
  • Python Programming
  • Leadership and Team Handling
  • Strong Communication and Collaboration Skills

Education

Bachelor's degree in Computer Science, Engineering, or a related field

Tools

  • SQL
  • ETL tools
  • Oracle RDBMS
  • Airflow

Job description

Overview

Role: Snowflake Architect

Location: Basildon, UK / Dublin, Ireland

Work from the client office 5 days per week

Mandatory Skills
  • Data Architecture
  • Data Migration
  • Data Modeling
  • Snowflake Designer/Developer
  • DBT (Data Build Tool)
  • ETL Design
  • AWS Services, including S3, ETL/EMR, Security, Lambda, etc.
  • StreamSets
  • Python Programming
  • Leadership and Team Handling
  • Strong Communication and Collaboration Skills
Responsibilities
  • Design, develop, and maintain robust data pipelines and ETL processes using Snowflake on AWS.
  • Implement data warehousing solutions, ensuring efficient storage, retrieval, and transformation of large datasets.
  • Collaborate with data analysts, scientists, and other stakeholders to define and fulfil data requirements.
  • Optimize performance and scalability of Snowflake data warehouse, ensuring high availability and reliability.
  • Develop and maintain data integration solutions, ensuring seamless data flow between various sources and Snowflake.
  • Monitor, troubleshoot, and resolve data pipeline issues, ensuring data quality and integrity.
  • Stay up to date with the latest trends and best practices in data engineering and in cloud services such as AWS.
Qualifications
  • Bachelor's degree in Computer Science, Engineering, or a related field.
  • 5+ years of experience in data engineering, with a strong focus on Snowflake and AWS.
  • Proficiency in SQL, Python, and ETL tools (StreamSets, DBT, etc.).
  • Hands-on experience with Oracle RDBMS.
  • Experience migrating data to Snowflake.
  • Experience with AWS services such as S3, Lambda, Redshift, and Glue.
  • Strong understanding of data warehousing concepts and data modelling.
  • Excellent problem-solving and communication skills, with a focus on delivering high-quality solutions.
  • Understanding of, or hands-on experience with, orchestration solutions such as Airflow.
  • Deep knowledge of key non-functional requirements such as availability, scalability, operability, and maintainability.
Evaluation / Area of Assessment
  • Data Architect — Must Have
  • Data Migration — Must Have
  • Data Modeling — Must Have
  • DBT Knowledge — Should Have
  • ETL Design — Must Have
  • Snowflake Designer/Developer — Must Have
  • AWS (S3, ETL/EMR, Security, Lambda, etc.) — Must Have
  • Leadership/Team handling — Must Have
  • Communication and Collaboration — Must Have
  • StreamSets — Should Have
  • Python — Should Have