Senior Data Engineer (Snowflake & AWS)

New Era Solutions

India

On-site

INR 15,00,000 - 25,00,000

Full time

Today

Job summary

A leading data solutions firm in India seeks a skilled Senior Data Engineer to design ETL/data pipelines and lead cloud migration efforts using Snowflake and AWS. The ideal candidate has over six years of experience in data engineering and cloud migration, along with strong Python programming skills. This role emphasizes collaboration and technical mentorship, ensuring robust data solutions that align with business objectives.

Qualifications

  • 6+ years in Data Engineering or a related field.
  • 4+ years of hands-on experience with Snowflake.
  • Strong Python skills for automation and workflows.
  • Expertise in SQL databases for query optimization.
  • Experience with AWS services for data engineering.
  • Familiarity with monitoring and alerting tools.

Responsibilities

  • Design scalable ETL/data pipelines using Python and Luigi.
  • Manage AWS infrastructure, ensuring availability and security.
  • Lead data migration initiatives to Snowflake.
  • Collaborate with stakeholders for alignment with business goals.
  • Mentor team members and foster technical excellence.

Skills

Python programming
Data Engineering
ETL best practices
SQL databases
AWS services

Tools

Snowflake
AWS
Luigi

Job description

About the Role

We are seeking a highly skilled Senior Data Engineer to design, build, and optimize scalable ETL/data pipelines while leading our cloud migration and modernization initiatives. The role involves leveraging Snowflake, AWS, and Python to deliver robust, high-performing data solutions for analytics, reporting, and business intelligence.

Key Responsibilities
  • Design and implement scalable ETL/data pipelines using Python and Luigi for high-volume clickstream, demographic, and business data.

  • Ensure efficient processing, code quality, and performance optimization across pipelines.

  • Configure, deploy, and maintain AWS infrastructure (EC2, S3, RDS, EMR), ensuring scalability, availability, and security.

  • Manage data storage/retrieval workflows using S3 and SQL-based storage solutions.

  • Provide architectural guidance for cloud-native data solutions and infrastructure.

  • Oversee and modernize legacy frameworks, recommending cloud migration strategies.

  • Lead large-scale data migration initiatives to Snowflake, ensuring security, integrity, and minimal disruption.

  • Collaborate with stakeholders to align infrastructure changes with business goals and budgets.

  • Develop monitoring and alerting solutions to track pipeline health, accuracy, and performance.

  • Drive incident response, root cause analysis, and post-mortem reviews for critical data issues.

  • Document workflows and troubleshooting procedures, and maintain system transparency.

  • Mentor and guide team members, fostering a culture of technical excellence and knowledge sharing.

Required Qualifications
  • 6+ years of experience in Data Engineering or a related field.

  • Minimum 4 years of hands-on experience with Snowflake (schema design, Snowpipe, COPY, tuning, access controls).

  • Strong Python programming skills for automation and data workflows.

  • Expertise in SQL databases for storage, query optimization, and performance tuning.

  • Proven experience with AWS services (S3, EC2, RDS, EMR) in a data engineering context.

  • Experience with monitoring and alerting tools for proactive issue detection.

  • Strong knowledge of ETL best practices, cloud migration, and large-scale data modernization.

Preferred Skills (Nice to Have)
  • Experience with CI/CD for data pipelines (Git, Jenkins, or similar).

  • Exposure to BI/analytics integration with Snowflake.

  • Familiarity with Airflow or other workflow orchestration frameworks.
