Experience: 3+ years
Salary: INR 2,500,000 – 4,500,000 / year (based on experience)
Shift: GMT+05:30, Asia/Kolkata (IST)
Opportunity Type: Remote
Placement Type: Full‑time Permanent Position
Note: This is a requirement for one of Uplers' clients – Nuaav
Company Overview
Nuaav is a boutique technology consulting firm specializing in scalable data engineering, cloud modernization, and AI‑driven transformation. We partner with enterprises to build modern data platforms, streamline migrations, and deliver high‑quality engineering solutions with agility and precision.
Role Summary
- Lead end‑to‑end data modernization initiatives, including Greenplum → Snowflake migrations, ETL/ELT development, performance tuning, and automated data processing workflows.
- Collaborate with architects, product teams, and clients to build secure, scalable, and efficient data ecosystems.
Key Responsibilities
- Design, develop, and optimize Snowflake objects—tables, views, stored procedures, tasks, streams, Snowpipe, and COPY pipelines.
- Migrate logic from Greenplum / Hadoop / SQL Server / Oracle into Snowflake stored procedures and scripts.
- Support end‑to‑end migration cycles: schema migration, code refactoring, unit testing, data validation, and performance tuning.
- Build and maintain ETL/ELT workflows using SSIS, Matillion, Talend, AWS Glue, or custom Python/SQL‑based pipelines.
- Implement ingestion frameworks for relational, NoSQL, and cloud sources into Snowflake, Redshift, or S3.
- Automate routine tasks, orchestrate pipelines, and monitor jobs using Control‑M, AWS Lambda, Azure DevOps, or similar tools.
- Tune complex SQL queries using Snowflake Query Profile, CTEs, dynamic SQL, clustering, partitioning, and caching techniques.
- Validate data between legacy and target environments, ensuring quality, performance, and reliability.
- Contribute to cloud modernization POCs, performance benchmarking, and DevOps practices (Git, CI/CD, automated deployments).
Required Skills & Experience
- Strong experience with Snowflake (stored procedures, UDFs, tasks, streams, Time Travel, Cloning, Fail‑safe, Snowpipe, COPY INTO, staging, and data loading).
- Proficiency in SQL and large‑scale query optimization.
- Experience with Greenplum, PostgreSQL, or equivalent MPP systems.
- Hands‑on with ETL tools such as SSIS, Matillion, Talend, AWS Glue, Informatica, etc.
- Cloud experience (AWS/Azure) including S3, Lambda, Redshift, or equivalent services.
Preferred / Good To Have
- Python scripting for data processing and validation.
- Experience migrating SQL logic between heterogeneous systems (e.g., Oracle → Snowflake, Greenplum → Snowflake, SQL Server → Redshift).
- Knowledge of microservices or API integrations.
- Familiarity with CI/CD pipelines, Git, Bitbucket, GitHub.
- Exposure to reporting tools like Power BI or Tableau.
Soft Skills
- Strong problem‑solving, analytical, and debugging abilities.
- Ability to collaborate in agile, multi‑stakeholder environments.
- Excellent communication and documentation skills.
Education
- B.E./B.Tech/M.Tech/MCA in Computer Science, Information Technology, or a related field.
Why Work With Nuaav?
- Work on large‑scale Snowflake migration projects and modern cloud data platforms.
- High ownership and meaningful work—your decisions matter.
- Opportunity to learn across data engineering, cloud, and AI initiatives.
- Fast‑paced but supportive consulting environment.
- Direct access to leadership and architects.
- Flexible work model (Noida office + hybrid/remote options).
How to Apply
- Step 1: Click “Apply” and register or log in on our portal.
- Step 2: Complete the screening form and upload your updated resume.
- Step 3: Completing these steps increases your chances of being shortlisted and interviewed by the client.