Job Title: Senior Data Engineer (Snowflake & Observability Implementation)
Location: LATAM [Remote]
Duration: 6 Months
Working Hours: PST (must be available during Pacific business hours)
Mandatory Skills: Snowflake, DBT, Splunk, OpsGenie, Snowflake + Streamlit, Splunk Dashboards, Version Control & IaC
Role Overview
We are seeking a proactive and hands-on Senior Data Engineer to drive the execution phase of our migration from Amazon Redshift to Snowflake. The architectural design is complete; we need a builder who can code, implement, and operationalize the solution with a heavy focus on Good Governance, Data Quality, and Observability at scale.
You will pick up where a previous engagement left off, leveraging DBT for transformations and driving the roadmap for 2026. You will work alongside an existing Data Engineer but are expected to lead the implementation with high initiative.
Key Responsibilities
1. Migration & Transformation Execution
- Lead the Implementation: Execute the transition from Redshift to Snowflake based on existing designs.
- DBT Transformation: Use DBT to manage transformations of AWS DMS (Database Migration Service) output and ensure seamless data integration.
- Code & Drive: This is not a passive role. You will be responsible for writing the code, configuring the environment, and driving the project forward to meet 2026 roadmap goals.
- Measure Output: Implement metrics that validate DMS output and confirm migration accuracy (a minimal sketch follows this list).
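As an illustration of the kind of validation metric described above, here is a minimal Python sketch (assuming the psycopg2 and snowflake-connector-python packages) that compares row counts between a Redshift source table and its Snowflake target after a DMS load. The table name, environment variables, and connection details are hypothetical placeholders, not part of the existing design.

```python
import os

import psycopg2                 # Redshift speaks the Postgres protocol
import snowflake.connector      # snowflake-connector-python


def count_rows(cursor, table: str) -> int:
    """Return the row count for a fully qualified table name."""
    cursor.execute(f"SELECT COUNT(*) FROM {table}")
    return cursor.fetchone()[0]


def validate_dms_load(table: str) -> bool:
    """Compare row counts between the Redshift source and the Snowflake target.

    Connection details come from environment variables (placeholders).
    """
    redshift = psycopg2.connect(
        host=os.environ["REDSHIFT_HOST"],
        port=5439,
        dbname=os.environ["REDSHIFT_DB"],
        user=os.environ["REDSHIFT_USER"],
        password=os.environ["REDSHIFT_PASSWORD"],
    )
    snow = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse=os.environ["SNOWFLAKE_WAREHOUSE"],
        database=os.environ["SNOWFLAKE_DATABASE"],
    )
    try:
        source_count = count_rows(redshift.cursor(), table)
        target_count = count_rows(snow.cursor(), table)
        print(f"{table}: source={source_count} target={target_count}")
        return source_count == target_count
    finally:
        redshift.close()
        snow.close()


if __name__ == "__main__":
    # Hypothetical table migrated by DMS; replace with real schema names.
    validate_dms_load("public.orders")
```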
2. Data Observability & Alerting at Scale
- Enterprise Alerting Integration: Operationalize a Proof of Concept (POC) that connects Splunk to OpsGenie.
- Scale Observability: Deploy observability checks (Freshness, Validity, Volume) across all tables in the ecosystem.
- Incident Routing: Configure automated workflows so that OpsGenie alerts reach the correct on-call contact immediately (see the sketch below).
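The production path routes Splunk alert actions into OpsGenie, but the hedged Python sketch below shows the shape of that routing with a direct call to the OpsGenie Alerts REST API. The responder team name, the OPSGENIE_API_KEY variable, and the check payload are assumptions for illustration only.

```python
import os

import requests

OPSGENIE_ALERTS_URL = "https://api.opsgenie.com/v2/alerts"


def page_on_call(check_name: str, table: str, details: str, priority: str = "P3") -> None:
    """Create an OpsGenie alert for a failed observability check.

    The responder team and the OPSGENIE_API_KEY variable are placeholders;
    real routing rules would live in OpsGenie itself (or in the Splunk
    alert action that forwards to OpsGenie).
    """
    payload = {
        "message": f"[{check_name}] failed for {table}",
        "alias": f"{check_name}:{table}",          # de-duplicates repeated alerts
        "description": details,
        "priority": priority,                      # P1 (critical) .. P5 (informational)
        "responders": [{"type": "team", "name": "data-platform-oncall"}],
        "tags": ["observability", check_name],
    }
    response = requests.post(
        OPSGENIE_ALERTS_URL,
        json=payload,
        headers={"Authorization": f"GenieKey {os.environ['OPSGENIE_API_KEY']}"},
        timeout=10,
    )
    response.raise_for_status()


if __name__ == "__main__":
    page_on_call(
        check_name="freshness",
        table="public.orders",
        details="Last load was more than 24 hours ago.",
    )
```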
3. Governance & Quality
- Data Governance: Enforce "Good Governance" practices, including rigorous Data Quality (DQ) checks and comprehensive documentation (see the documentation-coverage sketch after this list).
- Change Management: Manage the change process for data pipelines and schema evolution.
- Advanced DQ Tooling: Operationalize a DQ POC using Atlan, specifically leveraging its natural language capabilities for data discovery and quality management.
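As one hedged example of enforcing documentation alongside DQ checks (the Atlan POC itself is configured inside Atlan rather than in code), the Python sketch below scans dbt's generated target/manifest.json and fails CI when a model ships without a description. The manifest path and the exit-code policy are assumptions.

```python
import json
import sys
from pathlib import Path

# dbt writes this artifact on every `dbt compile` / `dbt run`.
MANIFEST_PATH = Path("target/manifest.json")   # assumed project-relative path


def undocumented_models(manifest_path: Path) -> list[str]:
    """Return the names of dbt models that have no description."""
    manifest = json.loads(manifest_path.read_text())
    missing = []
    for node in manifest["nodes"].values():
        if node["resource_type"] == "model" and not (node.get("description") or "").strip():
            missing.append(node["name"])
    return missing


if __name__ == "__main__":
    missing = undocumented_models(MANIFEST_PATH)
    if missing:
        print("Models missing documentation:", ", ".join(sorted(missing)))
        sys.exit(1)   # fail the CI job so undocumented models cannot ship
    print("All models are documented.")
```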
Qualifications
Technical Skills
- Snowflake (Expert): Deep experience with Snowflake architecture and migration patterns (Redshift to Snowflake preferred).
- DBT (Required): Strong proficiency in DBT for data modeling and transformation.
- Observability Stack: Hands-on experience with Splunk and OpsGenie (specifically integrating the two for automated alerting).
- Data Quality Tools: Familiarity with Atlan or similar modern data governance/catalog tools.
- AWS DMS: Experience handling output from AWS Database Migration Service or similar migration tools.
Soft Skills & Attributes
- High Initiative: You are a self-starter who identifies gaps and fixes them without needing constant direction.
- Execution-Focused: You prefer building and coding over pure theory/design.
- Communication: Ability to document technical processes clearly for future maintenance.