
Snowflake - Senior Data Engineer

Uplers

Remote

INR 25,00,000 - 45,00,000

Full time

Yesterday

Job summary

A boutique technology consulting firm is seeking a Data Modernization Lead to oversee data transformation projects. The ideal candidate will have extensive Snowflake experience, proficiency in SQL, and the ability to migrate data from various legacy systems. This role emphasizes cloud modernization and offers remote work flexibility. Strong leadership and problem-solving skills are essential for collaborating with cross-functional teams and delivering high-quality engineering solutions.

Benefits

Flexible work model
Direct access to leadership
Opportunity to work on cutting-edge projects

Qualifications

  • Strong experience with Snowflake features such as stored procedures and data loading.
  • Proficiency in SQL and experience with large-scale query optimization.
  • Hands-on experience with various ETL tools including AWS Glue.

Responsibilities

  • Lead end-to-end data modernization initiatives.
  • Design, develop, and optimize Snowflake objects.
  • Build and maintain ETL workflows using SSIS or AWS Glue.
  • Automate routine tasks and orchestrate data pipelines.

Skills

Experience with Snowflake
Proficiency in SQL
Experience with Greenplum
Hands-on with ETL tools
Cloud experience (AWS/Azure)

Education

B.E./B.Tech/M.Tech/MCA in Computer Science

Tools

SSIS
Matillion
AWS Glue

Job description

Experience: 3+ years

Salary: INR 2,500,000 – 4,500,000 / year (based on experience)

Shift: GMT+05:30, Asia/Kolkata (IST)

Opportunity Type: Remote

Placement Type: Full‑time Permanent Position

Note: This is a requirement for one of Uplers' clients – Nuaav

Company Overview

Nuaav is a boutique technology consulting firm specializing in scalable data engineering, cloud modernization, and AI‑driven transformation. We partner with enterprises to build modern data platforms, streamline migrations, and deliver high‑quality engineering solutions with agility and precision.

Role Summary
  • Lead end‑to‑end data modernization initiatives, including Greenplum → Snowflake migrations, ETL/ELT development, performance tuning, and automated data processing workflows.
  • Collaborate with architects, product teams, and clients to build secure, scalable, and efficient data ecosystems.
Key Responsibilities
  • Design, develop, and optimize Snowflake objects—tables, views, stored procedures, tasks, streams, Snowpipe, and COPY pipelines.
  • Migrate logic from Greenplum / Hadoop / SQL Server / Oracle into Snowflake stored procedures and scripts.
  • Support end‑to‑end migration cycles: schema migration, code refactoring, unit testing, data validation, and performance tuning.
  • Build and maintain ETL/ELT workflows using SSIS, Matillion, Talend, AWS Glue, or custom Python/SQL‑based pipelines.
  • Implement ingestion frameworks for relational, NoSQL, and cloud sources into Snowflake, Redshift, or S3.
  • Automate routine tasks, orchestrate pipelines, and monitor jobs using Control‑M, AWS Lambda, Azure DevOps, or similar tools.
  • Tune complex SQL queries using Snowflake Query Profile, CTEs, dynamic SQL, clustering, partitioning, and caching techniques.
  • Validate data between legacy and target environments, ensuring quality, performance, and reliability.
  • Contribute to cloud modernization POCs, performance benchmarking, and DevOps practices (Git, CI/CD, automated deployments).
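
The validation responsibility above (comparing data between legacy and target environments) can be sketched in plain Python. This is a minimal, order-insensitive comparison of two extracts; the row data and column names are illustrative, and in a real migration the rows would come from Greenplum and Snowflake cursors rather than in-memory lists.

```python
import hashlib

def table_fingerprint(rows, columns):
    """Order-insensitive fingerprint of a result set: row count plus a
    combined hash of every row, so two extracts can be compared without
    sorting large datasets identically."""
    digest = 0
    for row in rows:
        key = "|".join(str(row[c]) for c in columns)
        # XOR-combine per-row hashes so row order does not matter.
        digest ^= int(hashlib.sha256(key.encode()).hexdigest(), 16)
    return len(rows), digest

def validate(legacy_rows, target_rows, columns):
    """Return True when row counts and content fingerprints both match."""
    return table_fingerprint(legacy_rows, columns) == table_fingerprint(target_rows, columns)

# Sample extracts standing in for Greenplum (legacy) and Snowflake (target).
legacy = [{"id": 1, "amount": 100}, {"id": 2, "amount": 250}]
target = [{"id": 2, "amount": 250}, {"id": 1, "amount": 100}]  # same data, different order

print(validate(legacy, target, ["id", "amount"]))  # True
```

Checks like this are typically run per table after each migration cycle, with column-level aggregates (sums, min/max) added where full-row hashing is too expensive.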
Required Skills & Experience
  • Strong experience with Snowflake (stored procedures, UDFs, tasks, streams, Time Travel, Cloning, Fail‑safe, Snowpipe, COPY INTO, staging, and data loading).
  • Proficiency in SQL and large‑scale query optimization.
  • Experience with Greenplum, PostgreSQL, or equivalent MPP systems.
  • Hands‑on with ETL tools such as SSIS, Matillion, Talend, AWS Glue, Informatica, etc.
  • Cloud experience (AWS/Azure) including S3, Lambda, Redshift, or equivalent services.
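
As one concrete instance of the Snowflake data-loading skills listed above, here is a minimal Python helper that assembles a COPY INTO statement for files on a named stage. The table and stage names are hypothetical, and in practice the generated string would be executed through the Snowflake Python connector and the identifiers validated or quoted first.

```python
def build_copy_into(table, stage_path, file_type="CSV", skip_header=1,
                    on_error="ABORT_STATEMENT"):
    """Assemble a Snowflake COPY INTO statement for bulk-loading staged
    files into a table. Identifiers are illustrative placeholders."""
    return (
        f"COPY INTO {table} "
        f"FROM @{stage_path} "
        f"FILE_FORMAT = (TYPE = '{file_type}' SKIP_HEADER = {skip_header}) "
        f"ON_ERROR = '{on_error}'"
    )

sql = build_copy_into("analytics.orders", "etl_stage/orders/")
print(sql)
```

Generating the statement in code keeps loading parameters (file format, error handling) in one place, which helps when the same pipeline loads many tables.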
Preferred / Good To Have
  • Python scripting for data processing and validation.
  • Experience migrating SQL logic between heterogeneous systems (e.g., Oracle → Snowflake, Greenplum → Snowflake, SQL Server → Redshift).
  • Knowledge of microservices or API integrations.
  • Familiarity with CI/CD pipelines, Git, Bitbucket, GitHub.
  • Exposure to reporting tools like Power BI or Tableau.
Soft Skills
  • Strong problem‑solving, analytical, and debugging abilities.
  • Ability to collaborate in agile, multi‑stakeholder environments.
  • Excellent communication and documentation skills.
Education
  • B.E./B.Tech/M.Tech/MCA in Computer Science, Information Technology, or a related field.
Why Work With Nuaav?
  • Work on large‑scale Snowflake migration projects and modern cloud data platforms.
  • High ownership and meaningful work—your decisions matter.
  • Opportunity to learn across data engineering, cloud, and AI initiatives.
  • Fast‑paced but supportive consulting environment.
  • Direct access to leadership and architects.
  • Flexible work model (Noida office + hybrid/remote options).
How to Apply
  • Step 1: Click “Apply” and register or log in on our portal.
  • Step 2: Complete the screening form and upload your updated resume.
  • Step 3: Increase your chances of being shortlisted and interviewed by the client.