Job Search and Career Advice Platform

Data Pipeline Engineer

Global Connect Technologies

Remote

MXN 1,618,000 - 2,159,000

Full-time

Posted yesterday


Job summary

A leading technology company is seeking an experienced Senior Data Pipeline Engineer to design and operate secure data pipelines for Android experiences. The role requires expertise in Python and Airflow, focusing on building reliable ETL/ELT pipelines and managing cloud environments. Candidates must have a solid background in data engineering, excellent communication skills, and the ability to work autonomously. This is an opportunity to drive innovation in data solutions within a dynamic team.

Skills

Data engineering
Python
Airflow
SQL
ETL/ELT development
Cloud environments
Automation (CI/CD)
Data quality testing

Tools

Airflow 2.X
Terraform
Docker
Kubernetes
BigQuery
Redshift
Snowflake
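
As a rough illustration of how several of the tools above often fit together, a custom Airflow image might extend the official base image with the provider packages this stack implies (the image tag and package selection are assumptions, not taken from the posting):

```dockerfile
# Illustrative only: extend the official Airflow image with common providers
FROM apache/airflow:2.9.0-python3.11

# Provider packages for Kubernetes execution and warehouse/cloud access
RUN pip install --no-cache-dir \
    apache-airflow-providers-cncf-kubernetes \
    apache-airflow-providers-google \
    apache-airflow-providers-amazon

# Ship the project's DAGs with the image
COPY dags/ /opt/airflow/dags/
```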

Job description

We are seeking an experienced Senior Data Pipeline Engineer to design, build, and operate secure, reliable, and cost-efficient data pipelines supporting our Android connected and infotainment experiences. You will lead Python-based ETL/ELT development, Airflow orchestration, and data platform operations across cloud environments, working closely with Android, backend, and product teams.

Responsibilities

  • Build and maintain Airflow 2.X DAGs (TaskFlow, dynamic DAGs, deferrable operators, providers).
  • Develop robust Python ETL/ELT ingesting from APIs, storage, message buses, and databases.
  • Operate Airflow on Azure/Kubernetes; support blue/green and canary DAG releases.
  • Implement data quality testing, monitoring, observability, and lineage.
  • Design scalable batch and streaming pipelines with strong schema management.
  • Manage SQL and blob data stores (partitioning, clustering, retention).
  • Enforce security best practices, IAM, RBAC, secrets management, and data contracts.
  • Build CI/CD pipelines and IaC (Terraform, Docker, Helm).
  • Optimize cost/performance and document runbooks and decisions.
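
A minimal sketch of the ingestion and data-quality gates described above, in plain Python (all record fields, thresholds, and function names are illustrative; a real pipeline would run these as Airflow tasks against a warehouse):

```python
from dataclasses import dataclass
from typing import Iterable

@dataclass
class Record:
    device_id: str
    metric: str
    value: float

def extract(raw_rows: Iterable[dict]) -> list[Record]:
    """Parse raw API payloads into typed records, skipping malformed rows."""
    records = []
    for row in raw_rows:
        try:
            records.append(
                Record(str(row["device_id"]), str(row["metric"]), float(row["value"]))
            )
        except (KeyError, TypeError, ValueError):
            continue  # a production pipeline would quarantine these for review
    return records

def quality_checks(records: list[Record]) -> dict[str, bool]:
    """Basic quality gates: non-empty batch, ids present, values in a sane range."""
    return {
        "non_empty": len(records) > 0,
        "ids_present": all(r.device_id for r in records),
        "values_in_range": all(0 <= r.value <= 1e6 for r in records),
    }
```

In an Airflow deployment, a failed gate would typically fail the task and fire an alert rather than silently loading bad data downstream.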

Qualifications

  • 8+ years data or backend engineering experience with strong Python skills.
  • 2+ years Airflow 2.X expertise.
  • Proven track record building reliable ETL/ELT pipelines (batch + streaming).
  • Strong SQL and experience with major warehouses (BigQuery, Redshift, Snowflake).
  • Familiarity with IAM, OAuth/OIDC, secrets management, and monitoring.
  • Excellent communication and ability to work autonomously.