786 - Data Engineer Ssr (Python/SQL/Snowflake/AWS) · LATAM
Senior Data Engineer (Snowflake/Airflow/Python/AWS)
- Location: Anywhere in LATAM
- Job Type: Remote
- Project: Data Engineering for US-based Health Client
- Time Zone: GMT-3 to GMT-5 preferred
- English Level: B2 / C1
Get to Know Us
At Darwoft, we build digital products with heart. We're a Latin American tech company focused on creating impactful, human-centered software in partnership with companies around the globe. Our remote-first culture is based on trust, continuous learning, and collaboration.
We're passionate about tech, but even more about people. If you're looking to join a team where your ideas matter and your impact is real, welcome to Darwoft.
We're Looking for a Senior Data Engineer
You'll be joining a fast-moving, collaborative environment where your role will focus on designing and optimizing data pipelines using Python, Snowflake, Airflow, and AWS. You'll work closely with data scientists and analysts to build scalable solutions that support critical business decisions.
What You'll Be Doing
70% Data Pipeline Development
- Design and optimize ingestion, storage, and transformation pipelines using Python, SQL, Snowflake, and Snowpark
- Build and enhance real-time data pipelines with AWS Lambda and Snowpipe
- Collaborate with data scientists and analysts to deliver business-ready datasets
- Create internal and external data views (logical, materialized, and secure)
- Test and evaluate new features in Snowflake, Airflow, and AWS for proof of concepts
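As a rough illustration of the Lambda-plus-Snowpipe pattern mentioned above: a Lambda function is typically triggered by an S3 event notification, and the newly landed files are then picked up by Snowpipe for loading into Snowflake. The sketch below (hypothetical, stdlib only) shows only the event-parsing half of such a handler; in a real deployment the collected keys would be forwarded to Snowpipe's ingest endpoint, or Snowpipe auto-ingest would consume the S3 notification directly.

```python
import json

def lambda_handler(event, context=None):
    """Collect S3 object keys from an S3 event notification payload.

    Hypothetical sketch: the returned keys would normally be handed
    to Snowpipe for loading; here we only parse the incoming event.
    """
    keys = [
        record["s3"]["object"]["key"]
        for record in event.get("Records", [])
        if "s3" in record
    ]
    return {"statusCode": 200, "body": json.dumps({"files": keys})}
```

The `Records` / `s3` / `object` / `key` nesting follows the standard S3 event notification shape that Lambda receives from an S3 trigger.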
15% Code Review
- Participate in peer code reviews and provide constructive feedback
- Maintain clean, efficient, and scalable code
10% Agile Collaboration
- Join sprint ceremonies (planning, stand-ups, reviews, retrospectives)
- Ensure alignment with stakeholders on deliverables and timelines
5% Release Support
- Coordinate deployments with PMs and IT
- Ensure smooth release cycles with minimal downtime
What You Bring
- 5+ years of experience as a Data Engineer
- Strong expertise with Snowflake and orchestration tools like Airflow
- Advanced Python and SQL programming skills
- Hands-on experience with AWS services: Lambda, S3, and real-time data streaming
- Solid understanding of ELT pipelines, data modeling, and efficient storage strategies
- Great communication and collaboration skills
Nice to Have
- Experience working with healthcare data in the US
- Familiarity with data privacy regulations (HIPAA, GDPR)
- Experience with Snowpark and Snowflake's data sharing capabilities
Education
- Bachelor's degree in Computer Science, Engineering, or a related field
Perks & Benefits
- Contractor agreement with payment in USD
- 100% remote work
- Argentina's public holidays
- English classes
- Referral program
- Access to learning platforms
Explore More Opportunities
Check out all our open roles at www.darwoft.com/careers
Seniority level: Not Applicable
Job function: Information Technology
Industries: Software Development