We are seeking a Senior Snowflake Developer / Architect who will be responsible for designing, developing, and maintaining scalable data solutions that meet the needs of our organization.
In this role you will serve as the primary point of accountability for the technical implementation of data flows, repositories, and data-centric solutions in your area, translating requirements into efficient implementations. The solutions you create will support a wide range of reporting, analytics, decision support, and (Generative) AI use cases.
Your Role:
- Implement and manage data modeling techniques, including OLTP, OLAP, and Data Vault 2.0 methodologies.
- Write optimized SQL queries for data extraction, transformation, and loading.
- Utilize Python for advanced data processing, automation tasks, and system integration.
- Act as an advisor, applying in-depth knowledge of Snowflake architecture, features, and best practices.
- Develop and maintain complex data pipelines and ETL processes in Snowflake.
- Collaborate with data architects, analysts, and stakeholders to design optimal and scalable data solutions.
- Automate DBT jobs and build CI/CD pipelines using Azure DevOps for seamless deployment of data solutions.
- Ensure data quality, integrity, and compliance throughout the data lifecycle.
- Troubleshoot, optimize, and enhance existing data processes and queries for performance improvements.
- Document data models, processes, and workflows clearly for future reference and knowledge sharing.
- Build data tests, unit tests, and mock data frameworks.
Who You Are:
- Bachelor’s or master’s degree in computer science, mathematics, or a related field.
- At least 8 years of experience as a data warehouse expert, data engineer, or data integration specialist.
- In-depth knowledge of Snowflake components, including security and governance.
- Proven experience in implementing complex data models (e.g. OLTP, OLAP, Data Vault).
- A strong understanding of ETL, including end-to-end data flows from ingestion through data modeling to solution delivery.
- Proven industry experience with DBT and Jinja templating.
- Strong proficiency in SQL, with additional knowledge of Python (e.g. pandas and PySpark) being advantageous.
- Familiarity with AWS data and analytics services (especially Glue, Lambda, and DMS) is nice to have.
- Experience working with Azure DevOps and warehouse automation tools (e.g. Coalesce) is a plus.
- Experience with Healthcare R&D is a plus.
- Excellent English communication skills, with the ability to engage effectively with both R&D scientists and software engineers.
- Experience working in virtual and agile teams.