Junior Snowflake Data Engineer

Luxoft

Remote

GBP 50,000 - 70,000

Full time

3 days ago

Job summary

A leading technology company in the UK is looking for a skilled Snowflake Data Engineer to design and optimize data pipelines across AWS, Azure, and GCP. The role requires strong expertise in Snowflake, SQL, and ETL tools. You will collaborate with data analysts and business teams to deliver reliable data solutions and ensure data quality and compliance standards are met. Familiarity with Python is a plus, as are cloud certifications. This position offers an exciting opportunity for innovation and growth in the pharmaceutical sector.

Qualifications

  • 2+ years of experience in a related field.
  • Strong proficiency in Snowflake data warehousing features and performance tuning.
  • Experience in writing complex SQL queries and stored procedures (see the sketch after this list).
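
For illustration only (not part of the qualifications themselves): a minimal sketch of the kind of Snowflake stored procedure this refers to, written in Snowflake Scripting. The staging_events table and its loaded_at column are hypothetical.

  -- Purge rows older than a retention window from a hypothetical staging table.
  CREATE OR REPLACE PROCEDURE purge_stale_rows(retention_days INTEGER)
  RETURNS STRING
  LANGUAGE SQL
  AS
  $$
  BEGIN
      -- Compute the cutoff once; arguments are referenced directly in expressions.
      LET cutoff TIMESTAMP := DATEADD('day', -retention_days, CURRENT_TIMESTAMP());
      DELETE FROM staging_events WHERE loaded_at < :cutoff;
      LET n INTEGER := SQLROWCOUNT;  -- rows affected by the DELETE above
      RETURN 'Deleted ' || n || ' rows';
  END;
  $$;

  CALL purge_stale_rows(30);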

Responsibilities

  • Design and optimize scalable data pipelines using Snowflake.
  • Develop, maintain and monitor data models and transformations.
  • Collaborate with business teams to understand data requirements.

Skills

Snowflake
SQL
ETL Tools
Python
Data governance
Data quality frameworks
CI/CD practices

Tools

DBT
Tableau
AWS Glue
Terraform
Azure DevOps

Job description

Project description

Overview

The project is for one of the world's best-known science and technology companies in the pharmaceutical industry, supporting initiatives in AWS, AI, and data engineering, with plans to launch over 20 additional initiatives in the future. Modernizing the data infrastructure through the transition to Snowflake is a priority, as it will enhance capabilities for implementing advanced AI solutions and unlock numerous opportunities for innovation and growth. We are seeking a highly skilled Snowflake Data Engineer to design, build, and optimize scalable data pipelines and cloud-based solutions across AWS, Azure, and GCP. The ideal candidate will have strong expertise in Snowflake, ETL tools such as DBT, Python, visualization tools such as Tableau, and modern CI/CD practices, with a deep understanding of data governance, security, and role-based access control (RBAC). Knowledge of data modeling methodologies (OLTP, OLAP, Data Vault 2.0), data quality frameworks, Streamlit application development, SAP integration, and infrastructure-as-code with Terraform is essential. Experience working with different file formats such as JSON, Parquet, CSV, and XML is highly valued.
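
For illustration only (not part of the job requirements): a minimal sketch of the kind of Snowflake role-based access control the description refers to. All role, warehouse, database, and user names here are hypothetical.

  -- Create a read-only analyst role (hypothetical names throughout).
  CREATE ROLE IF NOT EXISTS data_analyst;

  -- Let the role run queries on a dedicated virtual warehouse.
  GRANT USAGE ON WAREHOUSE reporting_wh TO ROLE data_analyst;

  -- Grant read access to a database, including future tables.
  GRANT USAGE ON DATABASE analytics_db TO ROLE data_analyst;
  GRANT USAGE ON ALL SCHEMAS IN DATABASE analytics_db TO ROLE data_analyst;
  GRANT SELECT ON ALL TABLES IN DATABASE analytics_db TO ROLE data_analyst;
  GRANT SELECT ON FUTURE TABLES IN DATABASE analytics_db TO ROLE data_analyst;

  -- Attach the role to a user.
  GRANT ROLE data_analyst TO USER jane_doe;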

Responsibilities
  • Apply in-depth knowledge of Snowflake's data warehousing capabilities.
  • Use Snowflake's virtual warehouse architecture to optimize performance and cost.
  • Use Snowflake's data sharing and integration features for seamless collaboration.
  • Develop and optimize complex SQL scripts, stored procedures, and data transformations.
  • Work closely with data analysts, architects, and business teams to understand requirements and deliver reliable data solutions.
  • Implement and maintain data models, including dimensional models for data warehouses and data marts, and star/snowflake schemas to support reporting and analytics (see the star-schema sketch after this list).
  • Integrate data from various sources including APIs, flat files, relational databases, and cloud services.
  • Ensure data quality, data governance, and compliance standards are met.
  • Monitor and troubleshoot performance issues, errors, and pipeline failures in Snowflake and associated tools.
  • Participate in code reviews, testing, and deployment of data solutions in development and production environments.
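
As a concrete illustration of the dimensional modeling mentioned above, here is a minimal star-schema sketch in Snowflake SQL. Table and column names are hypothetical, and note that Snowflake treats the declared primary/foreign keys as metadata rather than enforced constraints.

  -- Hypothetical star schema: one fact table and two dimensions.
  CREATE TABLE IF NOT EXISTS dim_product (
      product_key  INTEGER AUTOINCREMENT PRIMARY KEY,
      product_name STRING,
      category     STRING
  );

  CREATE TABLE IF NOT EXISTS dim_date (
      date_key      INTEGER PRIMARY KEY,  -- e.g. 20240131
      calendar_date DATE,
      fiscal_year   INTEGER
  );

  CREATE TABLE IF NOT EXISTS fact_sales (
      product_key INTEGER REFERENCES dim_product (product_key),
      date_key    INTEGER REFERENCES dim_date (date_key),
      quantity    INTEGER,
      net_amount  NUMBER(12, 2)
  );

  -- Typical reporting query: revenue by category and fiscal year.
  SELECT p.category, d.fiscal_year, SUM(f.net_amount) AS revenue
  FROM fact_sales f
  JOIN dim_product p ON p.product_key = f.product_key
  JOIN dim_date d ON d.date_key = f.date_key
  GROUP BY p.category, d.fiscal_year;
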
Must have
  • 2+ years of experience
  • Strong proficiency in Snowflake (Snowpipe, RBAC, performance tuning).
  • Ability to write complex SQL queries, stored procedures, and user-defined functions.
  • Skills in optimizing SQL queries for performance and efficiency.
  • Experience with ETL/ELT tools and techniques, including Snowpipe, AWS Glue, Openflow, Fivetran, or similar tools for real-time and periodic data processing (see the Snowpipe sketch after this list).
  • Proficiency in transforming data within Snowflake using SQL, with Python being a plus.
  • Strong understanding of data security, compliance and governance.
  • Experience with DBT for database object modeling and provisioning.
  • Experience with version control tools, particularly Azure DevOps.
  • Good documentation and coaching practices.
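
For illustration (not part of the requirements): a minimal Snowpipe sketch for continuous ingestion of JSON files from an external stage. The bucket URL and all object names are hypothetical, and a real stage would also need storage credentials or a storage integration.

  -- Hypothetical landing table for raw JSON documents.
  CREATE TABLE IF NOT EXISTS raw_events (
      payload   VARIANT,
      loaded_at TIMESTAMP_NTZ DEFAULT CURRENT_TIMESTAMP()
  );

  -- External stage pointing at the source bucket (assumed URL).
  CREATE STAGE IF NOT EXISTS raw_events_stage
      URL = 's3://example-bucket/events/'
      FILE_FORMAT = (TYPE = JSON);

  -- AUTO_INGEST = TRUE lets cloud event notifications trigger loads.
  CREATE PIPE IF NOT EXISTS raw_events_pipe AUTO_INGEST = TRUE AS
      COPY INTO raw_events (payload)
      FROM (SELECT $1 FROM @raw_events_stage);
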
Nice to have
  • Cloud certifications are a plus.