
Senior Data Engineer Team Lead - OP01764

Dev.Pro

São Paulo

Remote

USD 60.000 - 90.000

Full-time

Posted yesterday

Job summary

A leading company seeks a Senior Data Engineer to spearhead a data modernization effort. The role involves leading a team and working with advanced cloud technologies to optimize data processes. This is an exciting opportunity for experienced professionals looking to make a significant impact in data engineering while enjoying a robust remote work environment.

Benefits

30 paid days off per year
5 paid sick days
Health insurance after probation
Wellness bonus
English lessons

Qualifications

  • 5+ years in data engineering and data warehouse modeling.
  • Expert-level SQL skills and experience in Snowflake and Apache Iceberg tables.
  • Upper-Intermediate+ English level required.

Responsibilities

  • Lead the Data Engineering team in migrating data pipelines.
  • Design and optimize data pipelines for the new cloud environment.
  • Act as a liaison between the technical team and the client.

Skills

Data engineering
ETL design
SQL
Python
Cloud services
Leadership
Communication

Education

Degree in Computer Science
Degree in Data Engineering
Degree in Information Systems

Tools

GCP
Snowflake
Apache Iceberg
dbt
Airflow

Job description

Are you in Brazil, Argentina, or Colombia? Join us as we actively recruit in these locations, offering a comfortable remote environment. Submit your CV in English, and we'll get back to you!

We invite a Senior Data Engineer to play a key part in a large-scale data modernization effort for a major enterprise client. You’ll lead the Data Engineering team in migrating and transforming complex legacy data pipelines to a modern custom-built cloud environment for improved scalability, maintainability, and compliance. You’ll also collaborate closely with architects, DevOps, QA, and product stakeholders to deliver scalable, reliable data solutions that meet unique business needs.

What's in it for you:

  • Join a fully integrated delivery team built on collaboration, transparency, and mutual respect
  • Lead a skilled Data Engineering team through a high-impact data platform transformation in a production environment
  • Work hands-on with modern, in-demand technologies like GCP, Snowflake, Apache Iceberg, dbt, Airflow, Dataflow, and BigQuery
  • Gain hands-on experience with Google Landing Zones

Is that you?

  • 5+ years in data engineering and data warehouse modeling
  • Strong proficiency in designing and building ETL for large data volumes and streaming solutions
  • Expert-level SQL skills and experience in Snowflake and Apache Iceberg tables
  • Hands-on experience with GCP services (BigQuery, GCS, Airflow, Dataflow, Dataproc, Pub/Sub)
  • Proficiency in Python for ETL scripting and DAG development
  • Experience using dbt for data transformation and orchestration
  • Familiarity with CI/CD processes and tools (Git, Terraform, Serverless)
  • Degree in Computer Science, Data Engineering, Information Systems, or related fields
  • Strong leadership skills with proven experience guiding technical teams
  • Strong communication and collaboration abilities
  • Upper-Intermediate+ English level

Desirable:

  • Experience building and managing streaming data pipelines and event-driven architectures
  • Experience writing Bash scripts
  • Experience with Java for Dataflow jobs
  • Familiarity with data lakehouse architectures using Iceberg tables
  • Proficiency with Docker for containerizing data pipelines and supporting orchestration
  • Familiarity with AI-assisted tools like GitHub Copilot

Key responsibilities and your contribution

In this role, you’ll combine hands-on engineering work with team leadership responsibilities to drive the team’s success and ensure smooth and timely project delivery.

  • Review and analyze existing ETL solutions for migration to the new architecture
  • Design, optimize, and migrate batch and streaming data pipelines to the GCP Landing Zone
  • Build and manage data transformations with dbt, supporting ELT pipelines in Snowflake
  • Ensure the new data infrastructure meets performance and quality SLAs/SLOs
  • Implement monitoring and alerting for pipelines to ensure system fault tolerance
  • Develop migration scripts to transfer historical data to Iceberg tables
  • Act as a liaison between the technical team and the client to ensure clear communication
  • Break down complex tasks into smaller, manageable technical deliverables for the team
  • Proactively identify risks and take steps to mitigate them

What's working at Dev.Pro like?

Dev.Pro is a global company that's been building great software since 2011. Our team values fairness, high standards, openness, and inclusivity for everyone, no matter your background.

We are 99.9% remote — you can work from anywhere in the world
Get 30 paid days off per year to use however you like — vacations, holidays, or personal time
5 paid sick days, up to 60 days of medical leave, and up to 6 paid days off per year for major family events like weddings, funerals, or the birth of a child
Partially covered health insurance after the probation period, plus a wellness bonus for gym memberships, sports nutrition, and similar needs after 6 months
We pay in U.S. dollars and cover all approved overtime
Join English lessons and Dev.Pro University programs, and take part in fun online activities and team-building events

Our next steps:

Submit a CV in English — Intro call with a Recruiter — Internal interview — Client interview — Offer

Interested? Find out more:

How we work

LinkedIn Page

Our website

IG Page
