Lead Data Engineer (Snowflake)

Allata

Buenos Aires (AR)

Remote

USD 120,000 - 150,000

Full time

Posted today

Job summary

A global consulting firm is seeking a Lead Data Engineer to own and evolve the Snowflake data platform. This role involves defining architecture, mentoring engineers, and delivering data solutions. Candidates should have hands-on Snowflake experience, strong SQL and Python skills, and familiarity with Agile methodologies. Join a dynamic team that values technical leadership and innovative solutions to complex data challenges.

Qualifications

  • Deep hands-on experience across core Snowflake features.
  • Strong SQL and Python skills for data engineering.
  • Experience with modern data ingestion tools and orchestration.

Responsibilities

  • Define and evolve Snowflake platform architecture.
  • Lead and mentor data engineers.
  • Manage delivery using Agile methodologies.

Skills

Snowflake expertise
SQL proficiency
Python for data engineering
ELT-first development
Cloud experience
Data ingestion and orchestration

Tools

Fivetran
Airflow
Terraform

Job description

Allata is a global consulting and technology services firm with offices in the US, India, and Argentina. We help organizations accelerate growth, drive innovation, and solve complex challenges by combining strategy, design, and advanced technology. Our expertise covers defining business vision, optimizing processes, and creating engaging digital experiences. We architect and modernize secure, scalable solutions using cloud platforms and top engineering practices.

Allata also empowers clients to unlock data value through analytics and visualization, and leverages artificial intelligence to automate processes and enhance decision-making. Our agile, cross-functional teams work closely with clients, either integrating with their teams or providing independent guidance, to deliver measurable results and build lasting partnerships.

We’re looking for a hands-on Lead Data Engineer to own and evolve our clients’ Snowflake data platform. You’ll lead and mentor a squad of data engineers, shape the technical roadmap, and deliver high-quality, cost-efficient, and secure data solutions. This role blends architecture, delivery leadership, and coding—ideal for someone who can set standards, coach others, and still dive deep when needed.

Role & Responsibilities:

  • Define and evolve Snowflake platform architecture, standards, and best practices.
  • Translate business goals into a pragmatic technical roadmap and delivery plan.
  • Lead and mentor data engineers; establish quality bars, review code, and guide execution.
  • Manage delivery using Agile rituals; align priorities across data, analytics/BI, and application teams.
  • Design, build, and optimize ELT pipelines on Snowflake with an ELT-first approach (dbt preferred).
  • Implement modern data ingestion using tools such as Fivetran, ADF, Glue, or Matillion.
  • Set up orchestration and CI/CD for data using Airflow or Dagster and Git-based pipelines.
  • Ensure data quality, observability, monitoring, alerting, documentation, and runbooks.
  • Apply performance tuning and cost optimization in Snowflake, including query profiling and warehouse sizing.
  • Implement security and governance in Snowflake (RBAC, masking, row access policies, auditing, data sharing); a minimal sketch follows this list.
  • Facilitate discovery with stakeholders, clarify requirements, and communicate trade-offs and recommendations.
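
By way of illustration, here is a minimal sketch of the security and governance work described above, written in Snowflake SQL. All object names (analyst_role, analytics.core.customers, governance.region_map, PII_ADMIN) are hypothetical and not taken from this posting:

    -- RBAC: a read-only role for analysts (names are illustrative)
    CREATE ROLE IF NOT EXISTS analyst_role;
    GRANT USAGE ON DATABASE analytics TO ROLE analyst_role;
    GRANT USAGE ON SCHEMA analytics.core TO ROLE analyst_role;
    GRANT SELECT ON ALL TABLES IN SCHEMA analytics.core TO ROLE analyst_role;

    -- Dynamic masking: only a privileged role sees raw email addresses
    CREATE MASKING POLICY IF NOT EXISTS email_mask AS (val STRING) RETURNS STRING ->
      CASE WHEN CURRENT_ROLE() = 'PII_ADMIN' THEN val ELSE '***MASKED***' END;
    ALTER TABLE analytics.core.customers
      MODIFY COLUMN email SET MASKING POLICY email_mask;

    -- Row access: a mapping table decides which roles may read which regions
    CREATE ROW ACCESS POLICY IF NOT EXISTS region_policy AS (region STRING) RETURNS BOOLEAN ->
      CURRENT_ROLE() = 'PII_ADMIN'
      OR EXISTS (SELECT 1 FROM governance.region_map m
                 WHERE m.role_name = CURRENT_ROLE() AND m.region = region);
    ALTER TABLE analytics.core.customers ADD ROW ACCESS POLICY region_policy ON (region);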

Hard Skills – Must have:

  • Deep hands-on Snowflake experience: warehouses, databases/schemas, stages, file formats, external tables, Snowpipe, Streams and Tasks, Time Travel/Fail-safe, query performance, micro-partitions, clustering, and cost management (see the ingestion sketch after this list).
  • Strong SQL and Python for data engineering use cases.
  • ELT-first development with dbt or an equivalent modeling/testing/documentation framework.
  • Experience with modern ingestion and orchestration (Fivetran, ADF, Glue, Matillion, Airflow, or Dagster).
  • Cloud experience (AWS preferred; Azure or GCP considered), including storage, IAM, networking basics, and secrets management.
  • CI/CD for data with Git-based workflows and environment promotion.
  • Familiarity with Infrastructure as Code (e.g., Terraform Snowflake provider).
  • Solid understanding of dimensional modeling and data quality practices.
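
As a rough sketch of the ingestion and orchestration pattern named in the must-haves, the following Snowflake SQL wires Snowpipe to a Stream and a Task; the stage, tables, and warehouse (raw.orders_stage, raw.orders, core.orders, transform_wh) are assumptions for the example:

    -- Snowpipe: auto-ingest CSV files as they land in an external stage
    CREATE PIPE IF NOT EXISTS raw.orders_pipe AUTO_INGEST = TRUE AS
      COPY INTO raw.orders
      FROM @raw.orders_stage
      FILE_FORMAT = (TYPE = 'CSV');

    -- Stream: track new rows arriving on the landing table
    CREATE STREAM IF NOT EXISTS raw.orders_stream ON TABLE raw.orders;

    -- Task: move changes downstream every 15 minutes, only when the stream has data
    CREATE TASK IF NOT EXISTS raw.orders_merge_task
      WAREHOUSE = transform_wh
      SCHEDULE = '15 MINUTE'
      WHEN SYSTEM$STREAM_HAS_DATA('raw.orders_stream')
    AS
      INSERT INTO core.orders
      SELECT order_id, amount, order_date FROM raw.orders_stream;

    -- Tasks are created suspended; resume to start the schedule
    ALTER TASK raw.orders_merge_task RESUME;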

Nice to have/It’s a plus:

  • Snowpark (Python), UDFs or stored procedures, and Dynamic Tables (see the sketch after this list).
  • Streaming or CDC with Kafka, Kinesis, or Debezium; event-driven patterns.
  • BI or semantic layer exposure (Power BI, Tableau, Looker); metrics layer concepts.
  • Experience with Azure Synapse, Microsoft Fabric, or Databricks in lakehouse contexts.
  • Security and compliance exposure (PII handling, encryption, auditing, regulated environments).
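
For the Dynamic Tables item above, a one-statement sketch of a declaratively refreshed aggregate; the table and warehouse names are invented for illustration. Compared with the Stream-and-Task pattern shown earlier, a Dynamic Table trades explicit scheduling for a declared freshness target:

    -- Dynamic Table: Snowflake keeps the result within the declared lag
    CREATE OR REPLACE DYNAMIC TABLE core.daily_revenue
      TARGET_LAG = '30 minutes'
      WAREHOUSE = transform_wh
      AS
        SELECT order_date, SUM(amount) AS revenue
        FROM core.orders
        GROUP BY order_date;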

Soft Skills:

  • Proven leadership with coaching, feedback, objective setting, and conflict resolution.
  • Excellent stakeholder communication; able to facilitate discovery and manage expectations.
  • Ownership mindset with proactive risk identification, prioritization, and mitigation.
  • Structured problem-solving; comfortable with ambiguity and change.
