Lead Data Engineer (Snowflake & Python)
Full-time
Remote
Position Overview:
We are seeking a highly skilled Lead Data Engineer with deep expertise in Snowflake, SQL, and Python, and exposure to MuleSoft-based integrations. This role will be instrumental in architecting and optimizing scalable data solutions that support AI foundations, data reconciliation, and enterprise-wide automation strategies. You will lead the development of robust Snowflake-based data pipelines, support real-time and batch integrations, and drive operational excellence across the data platform.
Key Responsibilities:
- Design, build, and deploy scalable and high-performance data models and pipelines in Snowflake to support foundational AI/ML, data reconciliation, and automation use cases.
- Develop and optimize complex Snowflake workloads, including ingestion, transformation (Streams, Tasks), and secure data sharing, with a focus on performance and cost efficiency.
- Write efficient and reusable Python scripts to automate Snowflake operations such as data loading, transformation orchestration, and user/role management.
- Support and collaborate on MuleSoft configurations and flows to enable real-time and batch data ingestion into Snowflake using reusable APIs and connectors.
- Diagnose and resolve issues impacting Snowflake data processing, including upstream failures from MuleSoft or other external data sources.
- Maintain secure, compliant Snowflake environments (dev, test, prod) aligned with best practices in data governance and access control.
- Partner with cross-functional teams to define data contracts and integration strategies, and ensure data quality and traceability within Snowflake.
- Continuously evaluate and recommend improvements to Snowflake performance, automation workflows, and MuleSoft-based data integrations.
Required Qualifications:
- 7+ years of experience in data engineering or data platform roles.
- Strong hands-on expertise in Snowflake (data modeling, performance tuning, Streams & Tasks, data sharing).
- Advanced Python skills for scripting, automation, and orchestration.
- Experience with ETL/ELT pipelines and workflow orchestration tools.
- Exposure to MuleSoft or similar iPaaS platforms (e.g., Boomi, Apache NiFi, Talend) for integrating data from enterprise systems.
- Solid understanding of data governance, access control, and environment management in cloud data platforms.
- Experience collaborating with business and analytics teams to define and implement integration strategies.
- Excellent troubleshooting skills and ability to resolve complex data flow issues in distributed environments.
Preferred Qualifications:
- Snowflake certification(s) (e.g., SnowPro Core or SnowPro Advanced).
- Familiarity with cloud platforms such as AWS, Azure, or GCP.
- Experience in building data solutions that support AI/ML or intelligent automation.
- Understanding of API-led integration principles and RESTful API design.