Software Developer - Data Warehouse (Mid-Sr Snowflake)

Anblicks

Hyderabad

On-site

INR 8,00,000 - 15,00,000

Full time

16 days ago

Job summary

A leading technology firm in Hyderabad is seeking a Software Developer for Data Warehouse to architect large-scale data intelligence solutions primarily using Snowflake. The ideal candidate will have strong experience in ETL development, SQL, and cloud technologies. Excellent communication skills and problem-solving abilities are essential. This role offers opportunities to work with cutting-edge data technologies.

Qualifications

  • Experience in architecting and implementing data intelligence solutions.
  • Solid understanding of Snowflake Data Warehouse operations.
  • Ability to develop data ingestion and processing pipelines.

Responsibilities

  • Architect and implement large scale data intelligence solutions.
  • Develop ETL pipelines for Snowflake using Python and SnowSQL.
  • Translate BI requirements to database and reporting designs.

Skills

Data architecture
ETL development
SQL
Cloud technologies (AWS/Azure/GCP)
Python
Snowflake
Java
Spark
Scala

Job description

Software Developer - Data Warehouse (Mid-Sr Snowflake)

Mid/Senior Consultant job description

1) Responsible for architecting and implementing very large-scale data intelligence solutions around the Snowflake Data Warehouse.
2) Solid experience in architecting, designing, and operationalizing large-scale data and analytics solutions on the Snowflake Cloud Data Warehouse.
3) Working knowledge of AWS, Azure, or GCP.
4) Developing ETL pipelines into and out of the data warehouse using a combination of Python and Snowflake's SnowSQL. Experience developing data ingestion and processing pipelines using Java, Spark, Scala, or Python.
5) Writing SQL queries against Snowflake.
6) Developing scripts (Unix shell, Python, etc.) to Extract, Load, and Transform data.
7) Translating BI and reporting requirements into database and report designs.
8) Understanding data transformation and translation requirements, and which tools to leverage to get the job done.
9) Understanding data pipelines and modern, cloud-based approaches to automating them; testing and clearly documenting implementations so others can easily understand the requirements, implementation, and test conditions.
10) Experience designing and implementing a fully operational, production-grade, large-scale data solution on the Snowflake Data Warehouse.
11) Hands-on experience designing and implementing production-grade data warehousing solutions on large-scale data technologies such as Teradata, Oracle, or DB2.
12) Expertise in and excellent understanding of Snowflake internals and the integration of Snowflake with other data processing and reporting technologies.
13) Excellent presentation and communication skills, both written and verbal; ability to problem-solve and architect in an environment with unclear requirements.
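As a rough illustration of the ETL scripting described in items 4-6, here is a minimal Python sketch of an extract-and-transform step, with the kind of SnowSQL load statement such a pipeline might run. All table, stage, and field names below are hypothetical examples, not part of this posting:

```python
import csv
import io

def transform(row):
    # Hypothetical transform: normalize a raw order record before
    # loading it into a Snowflake staging table.
    return {
        "order_id": int(row["order_id"]),
        "amount_usd": round(float(row["amount"]), 2),
        "region": row["region"].strip().upper(),
    }

def extract(csv_text):
    # Parse CSV text and apply the transform to each record.
    return [transform(r) for r in csv.DictReader(io.StringIO(csv_text))]

# After staging the transformed file, a load step would typically run a
# COPY INTO statement via SnowSQL or snowflake-connector-python.
# Table and stage names here are invented for illustration:
LOAD_SQL = """
COPY INTO analytics.staging.orders
FROM @analytics.staging.orders_stage
FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);
"""

if __name__ == "__main__":
    raw = "order_id,amount,region\n1,19.994,apac \n2,5.5,emea\n"
    for rec in extract(raw):
        print(rec)
```

In practice the transform step is often pushed into Snowflake itself (ELT rather than ETL), with Python orchestrating the SQL; this sketch only shows the general shape of the work.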
