
Data Pipeline Engineer (Python / Airflow)

Global Connect Technologies

Xico

On-site

MXN 1,434,000 - 1,794,000

Full-time

Posted yesterday


Job summary

A technology firm based in Xico, Veracruz is looking for a Senior Data Pipeline Engineer. The ideal candidate will design, build, and operate secure data pipelines, leading Python-based ETL/ELT development and managing Airflow orchestration. Candidates should have a strong background in backend engineering, demonstrated expertise in data pipeline management, excellent communication skills, and the ability to work autonomously. Experience with cloud environments and SQL is essential, along with familiarity with security best practices.

Requirements

  • 8+ years data or backend engineering experience with strong Python skills.
  • 2+ years Airflow 2.X expertise.
  • Proven track record building reliable ETL / ELT pipelines.

Responsibilities

  • Build and maintain Airflow 2.X DAGs.
  • Develop robust Python ETL / ELT pipelines.
  • Operate Airflow on Azure / Kubernetes.

Skills

  • Strong Python skills
  • Data pipeline development
  • Airflow 2.X expertise
  • Strong SQL skills
  • Communication

Tools

  • Terraform
  • Docker
  • Kubernetes
  • Apache Airflow
  • SQL

Job description

We are seeking an experienced Senior Data Pipeline Engineer to design, build, and operate secure, reliable, and cost-efficient data pipelines supporting our Android-based connected and infotainment experiences.

You will lead Python-based ETL / ELT development, Airflow orchestration, and data platform operations across cloud environments, working closely with Android, backend, and product teams.

Responsibilities
  • Build and maintain Airflow 2.X DAGs (TaskFlow, dynamic DAGs, deferrable operators, providers); see the sketch after this list.
  • Develop robust Python ETL / ELT ingesting from APIs, storage, message buses, and databases.
  • Operate Airflow on Azure / Kubernetes; support blue / green and canary DAG releases.
  • Implement data quality testing, monitoring, observability, and lineage.
  • Design scalable batch and streaming pipelines with strong schema management.
  • Manage SQL and blob data stores (partitioning, clustering, retention).
  • Enforce security best practices, IAM, RBAC, secrets management, and data contracts.
  • Build CI / CD pipelines and IaC (Terraform, Docker, Helm).
  • Optimize cost / performance and document runbooks and decisions.
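
Illustrative note: the sketch below shows, in broad strokes, the TaskFlow-style DAG referenced in the first responsibility. It assumes Airflow 2.4+, and the DAG id, API URL, and task names are hypothetical placeholders chosen for the example, not details taken from this posting.

    # Minimal Airflow 2.X TaskFlow DAG (assumes Airflow 2.4+). All names —
    # the DAG id, the API URL, and the warehouse step — are hypothetical
    # placeholders, not details from the employer.
    from datetime import datetime

    import requests
    from airflow.decorators import dag, task


    @dag(
        dag_id="example_api_to_warehouse",
        schedule="@daily",
        start_date=datetime(2024, 1, 1),
        catchup=False,
    )
    def example_api_to_warehouse():
        @task
        def extract() -> list[dict]:
            # Pull raw records from a hypothetical REST endpoint.
            response = requests.get("https://api.example.com/v1/events", timeout=30)
            response.raise_for_status()
            return response.json()

        @task
        def transform(records: list[dict]) -> list[dict]:
            # Drop malformed rows; a production pipeline would add schema
            # validation and data-quality checks at this step.
            return [r for r in records if r.get("event_id") is not None]

        @task
        def load(records: list[dict]) -> None:
            # Stand-in for a warehouse load (BigQuery, Redshift, Snowflake, ...).
            print(f"Loading {len(records)} records")

        load(transform(extract()))


    example_api_to_warehouse()

Chaining the decorated tasks (load(transform(extract()))) is what defines the dependency order in the TaskFlow style, with intermediate results passed between tasks via XCom.
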
Qualifications
  • 8+ years data or backend engineering experience with strong Python skills.
  • 2+ years Airflow 2.X expertise.
  • Proven track record building reliable ETL / ELT pipelines (batch + streaming).
  • Strong SQL and experience with major warehouses (BigQuery, Redshift, Snowflake).
  • Familiarity with IAM, OAuth / OIDC, secrets management, and monitoring.
  • Excellent communication and ability to work autonomously.
Nice to Have
  • Terraform, Kubernetes, Docker, Spark / Beam, Kafka / Event Hubs, dbt, Delta Lake, feature stores, automotive / IoT experience.