
Data Platform Architect

Baubap

México

Hybrid

MXN 50,000 - 70,000

Full-time

Today

Vacancy description

A tech company in Mexico is seeking a Data Engineer to define and implement foundational data architecture that supports growth. You will build and maintain scalable data solutions, mentor the engineering team, and ensure reliable data delivery at terabyte scale. The ideal candidate has 7+ years of experience in data engineering, strong AWS skills, and the ability to balance strategic design with hands-on implementation. Health insurance and a Christmas bonus are offered.

Benefits

1 month Christmas bonus (Aguinaldo)
Health & Life insurance

Requirements

  • 7+ years in data engineering/architecture, proven experience at TB+ scale.
  • Strong expertise in AWS (Aurora RDS, Redshift, S3, Glue, DMS).
  • Ability to communicate trade-offs and designs to both technical and non-technical stakeholders.

Responsibilities

  • Define and implement the transition from monolithic analytical workloads to a scalable data warehouse/lake.
  • Build and maintain ETL/ELT pipelines (batch + streaming).
  • Collaborate with data scientists, analysts, and backend engineers to translate needs into infrastructure.

Skills

Data engineering
Data architecture
AWS
SQL
Python
Pipeline orchestration
Data warehousing
Distributed systems

Tools

AWS Aurora
Redshift
S3
Airflow
Glue

Job description

Design and implement the foundational data architecture that will power our next stage of growth. This role is hybrid: highly strategic in defining long-term standards and architecture, while also hands‑on in building the first pipelines, storage layers, and frameworks that make data reliable, scalable, and accessible.

Unlike roles limited to pipeline maintenance, this position is responsible for the entire backbone of Baubap’s data ecosystem: migrating analytical workloads away from the monolithic architecture, standardizing schema practices, enabling distributed storage, and ensuring financial and behavioral data can be trusted at terabyte scale. This person will also be responsible for mentoring and growing the Data Engineering team, establishing a culture of quality and ownership.

Expected Outcome
  • Define and implement the transition from monolithic RDS analytical workloads to a scalable data warehouse/lake.
  • Standardize schema and database practices across the company.
  • Build and maintain pipelines (batch and real-time) that handle terabyte‑scale datasets from multiple sources (mobile, financial, behavioral).
  • Implement monitoring, validation, and alerting to ensure reliable data delivery.
  • Ensure financial datasets (payments, collections, reports) are reliable, reconcilable, and auditable, reducing inconsistencies.
  • Own the AWS‑based data stack (Aurora, Redshift, S3, Airflow, DMS, Glue), balancing reliability, cost, and scalability.
  • Evaluate and introduce tools for orchestration, observability, and optimization.
  • Mentor backend and data engineers, creating a culture of excellence in data engineering.
  • Define architectural standards, coding practices, and documentation templates.
Day to Day Tasks
  • Build and maintain ETL/ELT pipelines (batch + streaming).
  • Optimize large‑scale queries and data models for performance.
  • Manage and partition storage solutions (S3, Redshift) for efficiency.
  • Audit and fix bottlenecks in financial data flows.
  • Collaborate with data scientists, analysts, and backend engineers to translate needs into infrastructure.
  • Document architecture, schemas, and pipelines for long‑term clarity.
  • Propose and implement improvements in cost efficiency, resilience, and reliability.
What You'll Bring
  • 7+ years in data engineering/architecture, with proven experience at TB+ scale.
  • Experience designing and implementing data platforms.
  • Strong expertise in AWS (Aurora RDS, Redshift, S3, Glue, DMS).
  • Mastery of SQL and Python for data workflows.
  • Experience with pipeline orchestration (Airflow, Step Functions, etc.).
  • Solid knowledge of data warehousing and data lakes (partitioning, schema design, optimization).
  • Familiarity with distributed systems and performance tuning at scale.
  • Experience balancing strategic design and hands‑on execution.
  • Knowledge of CI/CD for data pipelines, access control, and monitoring.
  • Ability to communicate trade‑offs and designs to both technical and non‑technical stakeholders.
  • Experience mentoring engineers or growing data teams.
What We Can Offer
  • Being part of a multicultural, highly driven team of professionals
  • 1 month (proportional) of Christmas bonus (Aguinaldo)
  • Health & Life insurance