A leading data solutions firm in India seeks a skilled Senior Data Engineer to design ETL/data pipelines and lead cloud migration efforts using Snowflake and AWS. The ideal candidate has 6+ years of experience in data engineering and cloud migration, with strong Python programming skills. The role emphasizes collaboration and technical mentorship, ensuring robust data solutions align with business objectives.
We are seeking a highly skilled Senior Data Engineer to design, build, and optimize scalable ETL/data pipelines while leading our cloud migration and modernization initiatives. The role involves leveraging Snowflake, AWS, and Python to deliver robust, high-performing data solutions for analytics, reporting, and business intelligence.
Key Responsibilities:

Design and implement scalable ETL/data pipelines using Python and Luigi for high-volume clickstream, demographic, and business data.
Ensure efficient processing, code quality, and performance optimization across pipelines.
Configure, deploy, and maintain AWS infrastructure (EC2, S3, RDS, EMR), ensuring scalability, availability, and security.
Manage data storage/retrieval workflows using S3 and SQL-based storage solutions.
Provide architectural guidance for cloud-native data solutions and infrastructure.
Oversee and modernize legacy frameworks, recommending cloud migration strategies.
Lead large-scale data migration initiatives to Snowflake, ensuring security, integrity, and minimal disruption.
Collaborate with stakeholders to align infrastructure changes with business goals and budgets.
Develop monitoring and alerting solutions to track pipeline health, accuracy, and performance.
Drive incident response, root cause analysis, and post-mortem reviews for critical data issues.
Document workflows and troubleshooting procedures, and maintain system transparency.
Mentor and guide team members, fostering a culture of technical excellence and knowledge sharing.
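The pipeline and monitoring responsibilities above can be illustrated with a minimal Python sketch. All names here (functions, fields, metrics) are invented for the example, and a production version would, per this posting, run as a Luigi task reading from and writing to S3 rather than operating on in-memory strings.

```python
import csv
import io

# Illustrative ETL stage: parse raw clickstream rows, drop malformed
# records, and emit simple health metrics a monitoring system could
# alert on. Hypothetical sketch only -- not the firm's actual pipeline.

def transform_clicks(raw_csv: str) -> tuple[list[dict], dict]:
    reader = csv.DictReader(io.StringIO(raw_csv))
    clean, dropped = [], 0
    for row in reader:
        # Treat a record as malformed if a required field is missing or empty.
        if not row.get("user_id") or not row.get("ts"):
            dropped += 1
            continue
        clean.append(
            {"user_id": row["user_id"], "page": row.get("page", ""), "ts": row["ts"]}
        )
    total = len(clean) + dropped
    # Pipeline-health metrics: row counts and an error rate that an
    # alerting rule could compare against a threshold.
    metrics = {
        "rows_in": total,
        "rows_out": len(clean),
        "error_rate": dropped / total if total else 0.0,
    }
    return clean, metrics
```

An orchestrator such as Luigi would wrap each stage as a task with `requires()`/`output()` targets so that reruns are idempotent and failures resume from the last completed stage.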
Required Skills & Experience:

6+ years of experience in data engineering or a related field.
Minimum 4 years of hands-on Snowflake experience (schema design, Snowpipe, COPY INTO, performance tuning, access controls).
Strong Python programming skills for automation and data workflows.
Expertise in SQL databases for storage, query optimization, and performance tuning.
Proven experience with AWS services (S3, EC2, RDS, EMR) in a data engineering context.
Experience with monitoring and alerting tools for proactive issue detection.
Strong knowledge of ETL best practices, cloud migration, and large-scale data modernization.
Preferred:

Experience with CI/CD for data pipelines (Git, Jenkins, or similar).
Exposure to BI/analytics integration with Snowflake.
Familiarity with Airflow or other workflow orchestration frameworks.
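For a sense of the Snowflake bulk-loading work referenced above (Snowpipe/COPY INTO), a load from an external S3 stage is expressed as a COPY INTO statement. The sketch below only assembles such a statement as a string; the table, stage, and file-format names are hypothetical, and running it would require an actual Snowflake connection.

```python
def build_copy_into(table: str, stage: str, file_format: str, pattern: str) -> str:
    # Assemble a Snowflake COPY INTO statement for bulk-loading staged
    # files. Identifiers are assumed pre-validated; this illustrative
    # sketch does no quoting or escaping.
    return (
        f"COPY INTO {table}\n"
        f"FROM @{stage}\n"
        f"PATTERN = '{pattern}'\n"
        f"FILE_FORMAT = (FORMAT_NAME = '{file_format}')\n"
        f"ON_ERROR = 'ABORT_STATEMENT';"
    )

# Hypothetical usage: load gzipped CSVs from an S3 stage into a table.
sql = build_copy_into(
    table="analytics.clicks",
    stage="s3_click_stage/2024/",
    file_format="csv_std",
    pattern=".*[.]csv[.]gz",
)
```

For continuous ingestion, the same COPY definition would typically live inside a Snowpipe (`CREATE PIPE ... AS COPY INTO ...`) triggered by S3 event notifications rather than run ad hoc.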