Chief Engineering Architect

beBeeDataPipeline

México

On-site

MXN 1,446,000 - 1,809,000

Full-time

Today

Job description

A data engineering company in Mexico is seeking a Senior Data Pipeline Engineer to design and operate data pipelines using Apache Airflow and Python. The successful candidate will lead ETL/ELT processes at scale with a focus on testing and monitoring. Requirements include 8+ years in data engineering or backend engineering, strong Python skills, and experience with Airflow 2.x. The role is vital for optimizing data processes and improving data quality in a collaborative environment.

Benefits

Competitive salary and benefits
Collaborative work environment
Ongoing training and development

Qualifications

  • 8+ years of experience in data engineering or backend engineering.
  • Strong expertise in Python.
  • 2+ years of experience with Airflow 2.x.

Responsibilities

  • Design and build Airflow DAGs using TaskFlow and deferrable operators.
  • Develop Python ETL/ELT code for data ingestion.
  • Implement data quality testing for operators/hooks.
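The data-quality duty above can be sketched in plain Python as the kind of check a custom Airflow operator or hook would run and a unit test would exercise; the `null_rate` helper, column names, and threshold below are illustrative assumptions, not part of the posting.

```python
# Minimal sketch of a data-quality check of the kind an Airflow
# operator/hook might wrap; names and thresholds are illustrative.

def null_rate(rows: list[dict], column: str) -> float:
    """Return the fraction of rows where `column` is missing or None."""
    if not rows:
        return 0.0
    missing = sum(1 for r in rows if r.get(column) is None)
    return missing / len(rows)

def check_quality(rows: list[dict], column: str, max_null_rate: float = 0.05) -> None:
    """Raise ValueError if the null rate exceeds the allowed threshold."""
    rate = null_rate(rows, column)
    if rate > max_null_rate:
        raise ValueError(
            f"null rate {rate:.2%} for {column!r} exceeds {max_null_rate:.2%}"
        )

# Unit-test-style usage, as a CI job might run it:
rows = [{"id": 1, "email": "a@x.mx"}, {"id": 2, "email": None}]
assert null_rate(rows, "email") == 0.5
```

A real deployment would typically raise from inside an operator so the Airflow task fails and alerts fire, rather than letting bad data flow downstream.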

Skills

Data engineering
Backend engineering
Python
Apache Airflow
ETL/ELT processes
Data quality and testing
Cross-functional teamwork

Tools

Apache Airflow 2.x
Terraform
Docker

Senior Data Pipeline Engineer

Job Summary

We are seeking a highly skilled Senior Data Pipeline Engineer to join our team. As a key member of our engineering organization, you will be responsible for designing and operating reliable, secure, and cost-efficient data pipelines built with Apache Airflow (2.x) and Python.

The ideal candidate has 8+ years of experience in data engineering or backend engineering with strong Python expertise. They should have 2+ years of experience with Airflow 2.x, including operators, hooks, sensors, TaskFlow, and scheduler tuning.

In this role, you will lead the design of ETL/ELT processes at scale, with a focus on robust testing and monitoring. You will also work closely with Android and backend teams to define interfaces and data contracts, document decisions, and maintain operational runbooks.


Key Responsibilities

  • Design and build Airflow DAGs using TaskFlow, dynamic DAGs, deferrable operators, providers, and the Secrets backend
  • Develop Python ETL/ELT code to ingest from APIs, object storage, message buses, and databases
  • Operate Airflow on managed or self-hosted platforms, implementing blue/green or canary DAG releases
  • Implement data quality and testing with unit tests for operators/hooks, and DAG validation in CI
  • Build event-driven pipelines for near-real-time processing, manage schemas and compatibility
  • Model and manage data stores across SQL and blob storage, design partitioning, clustering, and retention
  • Observability & lineage: instrument metrics/logs, set SLAs/alerts, drive post-incident reviews and reliability improvements
  • Security & governance: apply least-privilege IAM, secrets management, PII handling, and data contracts; enforce RBAC in Airflow and warehouses
  • CI/CD & IaC: build pipelines to lint/test/deploy DAGs and Python packages, provision infra with Terraform/Helm, containerize with Docker
  • Cost & performance: tune task parallelism, autoscaling, storage formats, and compute footprints to optimize cost/perf
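The "DAG validation in CI" responsibility above boils down to structural checks run before deployment, much like the parse-time checks Airflow itself performs. A minimal stdlib sketch, assuming a hypothetical task-dependency mapping (the graph below is illustrative, not from the posting), is cycle detection over the DAG:

```python
# Sketch of one structural check behind "DAG validation in CI":
# verify that a task graph (task -> downstream tasks) is acyclic
# before deploying. The example graph is hypothetical.

def find_cycle(deps: dict[str, list[str]]) -> bool:
    """Return True if the dependency graph contains a cycle."""
    WHITE, GRAY, BLACK = 0, 1, 2   # unvisited / in progress / done
    color = {t: WHITE for t in deps}

    def visit(task: str) -> bool:
        color[task] = GRAY
        for nxt in deps.get(task, []):
            c = color.get(nxt, WHITE)
            if c == GRAY:          # back edge found: cycle
                return True
            if c == WHITE and visit(nxt):
                return True
        color[task] = BLACK
        return False

    return any(color[t] == WHITE and visit(t) for t in list(deps))

dag_ok = {"extract": ["transform"], "transform": ["load"], "load": []}
assert find_cycle(dag_ok) is False
```

In practice a CI job would also import each DAG file and fail on import errors or missing default arguments; this sketch only shows the acyclicity part of that validation.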

Requirements

  • 8+ years of experience in data engineering or backend engineering
  • Strong Python expertise
  • 2+ years of experience with Airflow 2.x
  • Knowledge of ETL/ELT processes
  • Experience with data quality and testing
  • Ability to work closely with cross-functional teams

What We Offer

  • A competitive salary and benefits package
  • A collaborative and dynamic work environment
  • Ongoing training and development opportunities