
Mid - Senior Data Engineer

Belvo

São Paulo

Hybrid

BRL 80,000 - 120,000

Full-time

Today

Job summary

A modern open finance platform is looking for a Senior Data Engineer in São Paulo. The role focuses on architecting and developing data infrastructure to support data pipelines and improve data discoverability. Candidates should have AWS experience and a strong background in data engineering practices with tools like Terraform and dbt. The position offers flexibility and the opportunity to work on challenging projects involving large datasets.

Benefits

Stock options
Annual company bonus
Flexible working hours
Remote friendly
Health Insurance
Training Budget
Yearly offsite

Qualifications

  • At least 2 years of experience with data engineering platforms on AWS cloud.
  • Experience with dbt and building infrastructure as code.
  • Experience with data catalogs and data lineage.

Responsibilities

  • Architect and develop infrastructure for data pipelines.
  • Define and develop the data platform in collaboration with stakeholders.
  • Maintain and evolve data platform products.

Skills

Data engineering platforms
Orchestrators like Dagster or Airflow
Fluent in English
Integrating third-party APIs

Tools

AWS
dbt
Terraform
Spark
Job description
Overview

A little bit about us:

We are Belvo, an open finance API platform with the bold vision of democratizing access to financial services in Latin America. We enable any financial innovator to access and interpret financial data, as well as initiate payments from their end-users' accounts. We’re turning the messy complexities of the Latin American financial ecosystem into a modern set of tools to access and interpret data and move money in a seamless and secure way.

We’re a highly-technical, passionate, and driven team. We are more than 90 people and our team currently represents 20 nationalities. We have offices in São Paulo and Mexico City – while a large portion of us work remotely.

We are tackling a very stimulating problem: connecting fintech innovators with legacy financial infrastructure. We strive to go beyond the limits of what is possible today and to do so in an elegant and developer-first way.

Since starting our adventure in May 2019, we have raised $71m from the leading VC investors globally.

You can read more about our company here and about our team and culture here. Also, head to our blog for more news about what we’re building and how we work.

About the team
  • We work in cross-functional, autonomous teams. We follow continuous delivery best practices executed on top of a modern technology stack.
  • Our products are built for developers, by developers. Technological excellence is at the heart of what we do.
  • We are pragmatic and customer-focused. We strive to find the right set of trade-offs in order to validate our hypothesis as early as possible, iterating on our products based on customer feedback.
  • We communicate transparently. We do weekly all-hands where we get together to discuss company performance and goals.
  • While we are global and remote-friendly, we also operate from our vibrant offices in CDMX and São Paulo. To accommodate the various time zones in which we are based, we ensure we’re always synced up between 3 pm and 6 pm, CEST.
  • Also, we are backed by some of the leading investors in Silicon Valley and Latin America, including Founders Fund, Kaszek Ventures, and YCombinator.
Your opportunity

We’re looking for a seasoned Senior Data Engineer to join our Data Platform team. The team's goal is to support data understanding at scale by architecting and developing the infrastructure to build data pipelines, move and transform complex datasets from different sources, and improve data discoverability and data literacy. The ideal candidate is someone who is sought out for technical guidance and can act as an owner of projects across the company. Ideally, they have experience building data infrastructure and are familiar with Data Mesh concepts.

As part of the team, you will be in contact with our stakeholders, ranging from data insights teams of analysts to deeply technical backend product teams to better define and develop the platform. You will be a central part of the roadmap definition for the team. You will have full ownership of some projects and will have the opportunity to define new data platform products. The current platform uses the latest technologies, like EMR Studio and Apache Iceberg, and as part of the team, you will also be responsible for maintaining and evolving it.

Our platform infrastructure is fully defined with Terraform, and we process over a thousand events per second. We run daily jobs that read over 40 terabytes of data using dbt over Athena and Spark on EMR clusters, all orchestrated with Dagster. We are moving some of our processes to stream processing using Kinesis and Flink.
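As a rough illustration of what a data platform "fully defined with Terraform" can look like, here is a minimal, hypothetical fragment defining an S3 data-lake bucket and an Athena workgroup. Every name and value below is an assumption for illustration only, not Belvo's actual configuration:

```hcl
# Hypothetical sketch: an S3 bucket for the data lake and an Athena
# workgroup whose query results land in that bucket. Illustrative only.
resource "aws_s3_bucket" "data_lake" {
  bucket = "example-data-lake"
}

resource "aws_athena_workgroup" "analytics" {
  name = "example-analytics"

  configuration {
    enforce_workgroup_configuration = true

    result_configuration {
      output_location = "s3://example-data-lake/athena-results/"
    }
  }
}
```

In a setup like this, dbt models would run against the Athena workgroup, while orchestration (e.g. Dagster) triggers the daily runs.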

This position may be for you if
  • You have at least 2 years of experience with data engineering platforms on the AWS cloud
  • You’re fluent in English
  • You’re familiar with orchestrators like Dagster or Airflow
  • You have previous experience with dbt
  • You enjoy a good challenge dealing with billions of events
  • You have experience integrating third-party APIs
  • You love to focus on getting things done
Amazing if
  • You have experience building infrastructure as code in the cloud for a data platform with Terraform.
  • You have experience with dbt and Great Expectations
  • You have experience with Spark, either with Scala or Python
  • You have experience with some of these AWS tools: EMR, DMS, Glue, Kinesis, Redshift
  • You have experience with data catalogs and data lineage
Our tech stack
  • We’re building our platform using modern technologies, focusing on reliability and long-term maintainability
  • We primarily use Python on the backend with Django and Python’s asyncio in parts of our stack
  • We use JavaScript, Vue.js, and Sass on the frontend with our own design system and component library
  • We run our infrastructure on AWS using managed services to focus on business problems
  • We observe and monitor our services using Datadog
  • We follow Continuous Integration and Continuous Delivery best practices
Our process steps

At Belvo, every hire is so important to us that we make hiring decisions as a team.

  • People team chat
  • Take-home challenge
  • Challenge presentation
  • Meet the founders
Our perks
  • Stock options (we are all owners and this is very important to us)
  • Annual company bonus linked to company performance
  • Flexible working hours
  • Remote friendly
  • Pet friendly
  • Health Insurance
  • Paid time off on your birthday
  • Renew your laptop every 3 years
  • Training Budget
  • Team building events
  • Bank holidays swap inside the same month
  • Fitness/wellness stipends
  • Fresh fruit every week, all-you-can-drink tea and coffee
  • Extra days off on company anniversaries
  • Yearly Company offsite
  • Yearly Department offsite