Snowflake Architect

N Consulting Ltd

Basildon

On-site

GBP 60,000 - 80,000

Full time

Today

Job summary

A leading data consulting firm in the UK is seeking a Snowflake Architect to design and maintain robust data pipelines on AWS. The ideal candidate will have over 5 years of experience in data engineering with a strong focus on Snowflake. Responsibilities include optimizing data warehousing and collaborating with teams to fulfill data requirements. The company offers a contract role located in Basildon, requiring on-site presence five days a week.

Qualifications

  • 5+ years of experience in data engineering, focusing on Snowflake and AWS.
  • Proficiency in SQL, Python, and ETL tools.
  • Experience with AWS services such as S3, Lambda, Redshift, and Glue.

Responsibilities

  • Design, develop, and maintain data pipelines and ETL processes using Snowflake.
  • Implement data warehousing solutions for efficient dataset management.
  • Collaborate with stakeholders to define data requirements.
  • Optimize performance and scalability of the Snowflake data warehouse.

Skills

Data Architecture
Data Migration
Data Modeling
Snowflake Designer / Developer
DBT (Data Build Tool)
ETL Design
AWS Services
StreamSets
Python Programming
Leadership and Team Management
Strong Communication

Education

Bachelor’s degree in Computer Science or a related field

Tools

Snowflake
AWS
Oracle RDBMS
ETL tools
Airflow

Job description

Overview

Role: Snowflake Architect

Location: Basildon

Work pattern: On-site at the client office, 5 days per week

Duration: Contract

Mandatory Skills Required
  • Data Architecture
  • Data Migration
  • Data Modeling
  • Snowflake Designer / Developer
  • DBT (Data Build Tool)
  • ETL Design
  • AWS Services – including S3, ETL / EMR, Security, Lambda, etc.
  • StreamSets
  • Python Programming
  • Leadership and Team Management
  • Strong Communication and Collaboration Skills

Responsibilities
  • Design, develop, and maintain robust data pipelines and ETL processes using Snowflake on AWS (an illustrative sketch follows this list).
  • Implement data warehousing solutions, ensuring efficient storage, retrieval, and transformation of large datasets.
  • Collaborate with data analysts, scientists, and other stakeholders to define and fulfill data requirements.
  • Optimize performance and scalability of Snowflake data warehouse, ensuring high availability and reliability.
  • Develop and maintain data integration solutions, ensuring seamless data flow between various sources and Snowflake.
  • Monitor, troubleshoot, and resolve data pipeline issues, ensuring data quality and integrity.
  • Stay up to date with the latest trends and best practices in data engineering and cloud technologies, including cloud services such as AWS.
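
For illustration only (not part of the client's job description): a minimal sketch of the kind of pipeline step described above, loading files from an S3 external stage into Snowflake with the snowflake-connector-python library. The account, credentials, warehouse, stage, and table names are placeholder assumptions.

    # Minimal sketch: copy staged S3 files into a Snowflake table.
    # Connection details, stage and table names are illustrative placeholders.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="<account_identifier>",
        user="<user>",
        password="<password>",
        warehouse="LOAD_WH",
        database="ANALYTICS",
        schema="RAW",
    )
    try:
        cur = conn.cursor()
        # @S3_ORDERS_STAGE is assumed to be an external stage already
        # configured against the source S3 bucket.
        cur.execute(
            """
            COPY INTO RAW.ORDERS
            FROM @S3_ORDERS_STAGE/orders/
            FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)
            ON_ERROR = 'ABORT_STATEMENT'
            """
        )
        print(cur.fetchall())  # per-file load results returned by COPY INTO
    finally:
        conn.close()
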
Qualifications
  • Bachelor’s degree in Computer Science, Engineering, or a related field.
  • 5+ years of experience in data engineering, with a strong focus on Snowflake and AWS.
  • Proficiency in SQL, Python, and ETL tools (StreamSets, DBT, etc.).
  • Hands-on experience with Oracle RDBMS.
  • Experience migrating data to Snowflake.
  • Experience with AWS services such as S3, Lambda, Redshift, and Glue.
  • Strong understanding of data warehousing concepts and data modeling.
  • Excellent problem-solving and communication skills, with a focus on delivering high-quality solutions.
  • Understanding of, and hands-on experience with, orchestration solutions such as Airflow (an illustrative sketch follows this list).
  • Deep knowledge of key non-functional requirements such as availability, scalability, operability, and maintainability.
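
For illustration only: a minimal sketch of the kind of Airflow orchestration referenced above, assuming Airflow 2.4+ with the apache-airflow-providers-snowflake package installed. The DAG id, connection id, and SQL statement are hypothetical, not taken from the job description.

    # Minimal sketch: a daily Airflow DAG that runs one Snowflake load step.
    # The connection id "snowflake_default" and the COPY statement are
    # illustrative assumptions.
    from datetime import datetime

    from airflow import DAG
    from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

    with DAG(
        dag_id="daily_snowflake_load",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        load_orders = SnowflakeOperator(
            task_id="load_orders",
            snowflake_conn_id="snowflake_default",
            sql="COPY INTO RAW.ORDERS FROM @S3_ORDERS_STAGE/orders/ "
                "FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)",
        )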