Job Search and Career Advice Platform


Lead Data Engineer

Cobre

Ciudad de México

On-site

MXN 600,000 - 800,000

Full-time

Posted yesterday

Vacancy Summary

A leading B2B payments platform in Ciudad de México is seeking an experienced Data Engineer to enhance its data infrastructure. The role involves building real-time data pipelines, leading a team, and ensuring the delivery of impactful data products. Candidates should have 5+ years in data engineering, proficiency in SQL and Python, and experience with platforms like Snowflake and Kafka. This position offers the opportunity to shape the future of financial infrastructure in Latin America.

Requirements

  • 5+ years in data engineering; 2+ leading a team or major workstream.
  • Experience building production data pipelines.
  • Strong SQL and Python skills required.

Responsibilities

  • Assess and improve current data stack for scale.
  • Build real-time data pipelines and supporting infrastructure.
  • Lead and grow engineering team.

Skills

Data pipeline construction
SQL
Python
CI/CD practices
AWS
Data quality management
Team leadership

Tools

Snowflake
Airflow
Kafka
Terraform
ELK

Job Description

Cobre is Latin America’s leading instant B2B payments platform. We solve the region’s most complex money movement challenges by building advanced financial infrastructure that enables companies to move money faster, safer, and more efficiently.

We enable instant business payments—local or international, direct or via API—all from a single platform.

Built for fintechs, PSPs, banks, and finance teams that demand speed, control, and efficiency. From real-time payments to automated treasury, we turn complex financial processes into simple experiences.

Cobre is the first platform in Colombia to enable companies to pay both banked and unbanked beneficiaries within the same payment cycle and through a single interface.

We are building the enterprise payments infrastructure of Latin America!

What we are looking for:

We're building a data platform that does three things: powers our internal operations, AI and risk capabilities, and ships data products directly to clients. Right now, we have the foundation. We need someone to turn it into a competitive advantage.

You’ll report to the Chief Data Officer and work with a team of 15+ (data engineers, ML scientists, risk analytics engineers, and PMs). We run an AI center of excellence, and this platform is its backbone.

What you’ll be doing:
  • The current stack — Snowflake, AWS, Terraform, Airflow, dbt, Kafka, ELK, Sigma. It works, but it wasn’t built for what’s coming: real‑time client‑facing products at scale. Your first job is to assess it honestly and build a plan.
  • Data infrastructure for decisioning — We have a dedicated squad building our decisioning platform. Your job is to give them the foundation: real‑time data pipelines, feature serving, low‑latency access to the signals they need. You’re not building the decisioning logic—you’re making sure the data is there when it matters.
  • Infrastructure supporting client‑facing data products — Real‑time APIs, embedded analytics, decisioning capabilities that our clients integrate into their own workflows. Not dashboards—products.
  • AI‑ML platform layer — Our data scientists need to deploy and monitor models in production without hand‑holding. You’ll build the infrastructure that makes that possible.
  • The team — You’ll hire and grow the engineering side as we scale.
What you need:

Technical:

  • 5+ years in data engineering; 2+ leading a team or major workstream
  • You’ve built production data pipelines (Airflow, Kafka, or similar) and know the difference between “works in dev” and “works at scale”
  • You establish and own CI/CD practices that ensure quality, reliability, and fast delivery of data solutions.
  • You own data quality and proactive monitoring: you identify and surface issues, and communicate them clearly before other teams find them.
  • Strong SQL and Python. We’ll ask you to prove it.
  • Strong knowledge of open table formats and experience with modern data architectures.
  • Experience with Snowflake or equivalent cloud warehouse.
  • Experience with AWS.
  • You’ve shipped data products to external users, not just internal stakeholders.

Leadership and Business Alignment:

  • Lead and develop the team, fostering high performance and professional growth.
  • Provide ongoing mentorship, feedback, and establish best practices.
  • Collaborate closely with business teams to understand priorities and requirements.
  • Align technical strategy and execution with business roadmaps across the organization.
  • Act as a bridge between business and technology, ensuring focus on impact and value delivery.
  • Build processes and documentation that enable teams to self‑serve on the data platform, reducing hand‑holding and dependency.
  • Stakeholder management across engineering, risk, finance, and client success teams.

Important but learnable:

  • English (team is bilingual)
  • Understanding of payment rails, reconciliation, and settlement processes
  • Knowledge of treasury operations and liquidity management data
  • Experience with financial data regulations (PCI‑DSS, data residency requirements in LatAm)