Data Engineer – Snowflake Platform Lead

United States

Remote

USD 120,000 - 165,000

Full time

2 days ago

Job summary

A leading technology firm seeks an experienced Data Engineer specializing in Snowflake to join their remote team. The successful candidate will architect, develop, and optimize data pipelines and ensure robust integration with BI and ML teams. Candidates with a strong background in data engineering, exceptional SQL proficiency, and AWS experience are encouraged to apply.

Skills

Data Engineering
SQL
Snowflake
AWS
Performance Optimization
Collaboration

Education

Bachelor's in Computer Science or related field

Tools

SnowSQL
dbt
Apache Airflow
AWS Glue

Job description

  • Location: Remote (EST)
  • Employment Type: Full-Time (no immigration sponsorship available)
  • Experience Level: 10+ Years in Data Engineering, with Snowflake expertise
Position Overview

We’re seeking a skilled Data Engineer to architect, develop, and optimize Snowflake-powered data pipelines within a modern cloud data platform. In this role, you will lead ELT workflow design, implement cost-efficient warehouse models, and enforce secure data access through advanced RBAC and data-masking policies. You’ll collaborate closely with BI and ML teams to ensure seamless integration, high performance, and optimal storage/compute utilization.
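
To make the load-then-transform pattern concrete, here is a minimal Snowflake SQL sketch of the kind of ELT flow this role owns. Every name in it (raw_db, orders_stage, the S3 URL, the my_s3_integration storage integration) is a hypothetical placeholder, not a detail from this posting, and it assumes the staging and target tables already exist.

    -- All object names below are hypothetical placeholders.
    -- Stage pointing at raw files in S3 (assumes a storage integration exists).
    CREATE STAGE IF NOT EXISTS raw_db.public.orders_stage
      URL = 's3://example-bucket/orders/'
      STORAGE_INTEGRATION = my_s3_integration
      FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1);

    -- "EL": land the raw files into a staging table.
    COPY INTO raw_db.public.orders_raw
      FROM @raw_db.public.orders_stage;

    -- "T": transform inside Snowflake with an idempotent MERGE.
    MERGE INTO analytics_db.public.orders AS tgt
    USING (
      SELECT order_id, customer_id, TO_DATE(order_date) AS order_date, amount
      FROM raw_db.public.orders_raw
    ) AS src
    ON tgt.order_id = src.order_id
    WHEN MATCHED THEN UPDATE SET tgt.amount = src.amount
    WHEN NOT MATCHED THEN INSERT (order_id, customer_id, order_date, amount)
      VALUES (src.order_id, src.customer_id, src.order_date, src.amount);

In practice a tool such as dbt, Airflow, or AWS Glue would schedule and version these steps rather than running them by hand.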

Key Responsibilities
  • ELT Pipeline Development: Design and build high-performance, modular ELT workflows in Snowflake using SnowSQL, dbt, Airflow, or AWS Glue.
  • Warehouse Modeling: Implement cost-efficient schemas (star, snowflake, or hybrid), clustering, and data partitioning strategies to optimize compute and storage.
  • Performance Tuning: Analyze query performance; apply optimizations such as micro-partition pruning, result caching, and warehouse sizing.
  • Security & Governance: Configure role-based access control (RBAC), dynamic data masking, and object-level permissions; manage secure data sharing and Snowflake Data Marketplace integrations.
  • Cloud Integration: Interface Snowflake with AWS services (S3, Lambda, IAM); automate data ingestion and event-driven processes.
  • BI & ML Enablement: Collaborate with analytics and data science teams to integrate BI tools (e.g., Looker, Tableau, Power BI) and support ML pipelines with feature tables and Snowpark.
  • Monitoring & Troubleshooting: Implement observability (query history, resource monitors); troubleshoot production issues and enforce SLAs.
  • Best Practices & Documentation: Establish coding standards, version control, and CI/CD for data projects; author clear technical documentation and runbooks.
Required Qualifications
  • 10+ years of professional experience in Data Engineering and AWS.
  • Snowflake Expertise: 3+ years implementing and tuning Snowflake data warehouses, SnowSQL, and Snowflake-specific features.
  • SQL Mastery: Deep proficiency in SQL, query optimization, and analytical query patterns.
  • Orchestration Tools: Hands-on experience with dbt, Apache Airflow, or AWS Glue for workflow automation.
  • Cloud Fundamentals: Familiarity with AWS S3, Lambda functions, and IAM for secure, serverless data workflows.
  • Performance & Cost Optimization: Proven ability to partition data, size virtual warehouses, and apply caching strategies.
  • Security Practices: Experience configuring RBAC models, dynamic masking, and secure data sharing in Snowflake.
  • Integration Skills: Track record of connecting Snowflake to BI platforms and supporting ML feature engineering.
  • Collaboration & Communication: Strong written and verbal skills; comfortable gathering requirements and presenting technical solutions to stakeholders.
Preferred Qualifications
  • Certifications in Snowflake (SnowPro Core) or AWS
  • Experience with Snowflake’s Snowpark API for Python, Scala, or Java
  • Familiarity with data catalog and governance tools (e.g., Collibra, Alation)
  • Background in agile development and CI/CD for data projects (Jenkins, GitLab CI/CD)
  • Prior work in regulated industries (finance, healthcare, insurance)
Benefits
  • Competitive salary and performance bonus
  • Comprehensive health, dental, and vision insurance
  • 401(k) with company match
  • Flexible work arrangements
  • Professional development and continuous learning opportunities
  • Collaborative and innovative work environment

Our company is an Equal Opportunity Employer committed to diversity in the workplace. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability status, protected veteran status, or any other characteristic protected by law.

Application Process: Qualified candidates should submit their resume, cover letter, and a brief description of relevant projects.
