
855 - Technical Operations Engineer · Senior · Remote · LATAM

buscojobs Argentina

Córdoba

Remote

EUR 30,000–50,000

Full-time

Today

Job description

A global technology solutions provider is seeking a Technical Operations Engineer to ensure the stability and reliability of data pipelines. This remote role requires over 5 years in Technical Operations and strong expertise in Python and cloud platforms. You will collaborate with various teams to optimize workflows and enhance data delivery. The position offers a contractor agreement with payment in USD and a 100% remote work environment.

Benefits

English classes
100% remote work
Referral program
Access to learning platforms

Requirements

  • 5+ years in Technical Operations, DevOps, or SRE with a focus on data platforms.
  • Proven experience managing enterprise-grade data services.
  • Ability to work with stakeholders in the EST time zone.

Responsibilities

  • Own and ensure the health, stability, and performance of AI-driven data platforms.
  • Monitor, troubleshoot, and optimize complex ETL workflows.
  • Develop Python-based scripts to automate deployments.

Skills

Expert Python skills for automation and operational tooling
Strong cloud experience (AWS or GCP)
SQL proficiency with BigQuery, Redshift, or Snowflake
Excellent communication skills

Education

Bachelor's or Master's in Computer Science, Engineering, or a related field

Tools

AWS
GCP
Elasticsearch

Full job description

Overview

Technical Operations Engineer

  • Location: Anywhere in LATAM
  • Job Type: Remote, full-time contractor
  • Project: Life Sciences & Healthcare Data Intelligence
  • Time Zone: EST overlap required
  • English Level: B2/C1

Get to Know Us

At Darwoft, we partner with cutting-edge companies around the world to build digital products that create real impact. One of our clients is a leading Life Sciences and Healthcare data intelligence company that is transforming decision-making by empowering global organizations with advanced analytics, real-time insights, and AI-driven platforms.

Their technology is used by some of the world’s top life sciences organizations to gain visibility into the efficacy and impact of scientific engagement, helping to improve patient outcomes at scale.

By joining this team, you’ll be contributing to a mission-driven environment where technology, data, and healthcare meet to create meaningful change.

About the Role

We’re seeking a Technical Operations Engineer to take ownership of the stability, scalability, and reliability of complex data pipelines and infrastructure. This role is highly autonomous and hands-on, suited to someone who thrives in a fast-paced, collaborative environment and brings strong expertise in Python, cloud platforms, and data operations.

You’ll work across teams to operationalize data architectures, optimize pipelines, and deliver robust solutions that drive both internal efficiency and customer satisfaction.

What You’ll Be Doing
  • Operational Reliability: Own and ensure the health, stability, and performance of AI-driven data platforms, pipelines, and infrastructure.
  • Pipeline Optimization: Monitor, troubleshoot, and optimize complex ETL/ELT workflows, ensuring data quality and availability.
  • Automation: Develop Python-based scripts and tools to automate deployments, workflows, and system maintenance.
  • Advanced Support: Collaborate with Engineering, Product, and Customer Success to resolve complex operational issues and ensure seamless data delivery.
  • Documentation: Build and maintain detailed operational runbooks, incident playbooks, and system guides.
  • Collaboration: Work in an Agile environment with engineers, product managers, and data scientists to operationalize analytics for Life Sciences & Healthcare datasets.
What You Bring
  • Bachelor's or Master's in Computer Science, Engineering, or a related field.
  • 5+ years in Technical Operations, DevOps, or SRE with a focus on data platforms.
  • Proven experience managing enterprise-grade data services (Data Pipelines, Data Lakes, Warehouses).
  • Expert Python skills for automation and operational tooling.
  • Strong cloud experience (AWS or GCP), including compute, storage, databases, containerization, and orchestration.
  • SQL proficiency with BigQuery, Redshift, or Snowflake.
  • Deep knowledge of ETL/ELT best practices, data governance, and compliance.
  • Ability to diagnose complex distributed-systems issues, with strong root-cause analysis (RCA) skills.
  • Excellent communication (verbal & written) and experience creating technical documentation.
  • Collaborative, proactive mindset with strong ownership.
  • Ability to work with stakeholders in the EST time zone.
  • Experience in regulated industries (healthcare, finance) and compliance (HIPAA).
Nice to Have
  • Experience in Life Sciences / Healthcare data domain.
  • Knowledge of MLOps and deploying AI / ML models.
  • Familiarity with data visualization tools (Looker, Power BI, etc.).
  • Experience with Elasticsearch or search technologies.
  • Understanding of ML frameworks (TensorFlow, PyTorch, scikit-learn, MLflow).
What Darwoft Offers

  • Contractor agreement with payment in USD
  • 100% remote work
  • Argentina's public holidays
  • English classes
  • Referral program
  • Access to learning platforms

Explore this and other opportunities at:

www.darwoft.com/careers
