Data Engineer_L3_ Program Tech Lead

Grupo Data

São Paulo

Remote work

BRL 80,000 - 120,000

Full-time

Today

Job summary

A technology services provider in São Paulo is seeking a Data Engineer_L3_ Program Tech Lead to lead the migration of data pipelines from Snowflake to Databricks. The role requires extensive experience in data engineering, ETL orchestration, and database management, along with strong programming skills. The ideal candidate will have a proven track record in managing teams and applying Scrum methodologies to enhance project workflows. This position offers a fully remote work model.

Qualifications

  • Minimum of 4 years of professional experience in data engineering, business intelligence, or a similar role.
  • Experience working with Snowflake, Redshift, PostgreSQL, or other DBMS platforms.
  • Strong leadership and communication skills.

Responsibilities

  • Lead the migration of data pipelines from Snowflake to Databricks.
  • Design, develop, and optimize ETL workflows and data pipelines.
  • Collaborate with cross-functional teams to understand database requirements.

Skills

Very good English - C1
Data engineering
ETL orchestration
Programming (Python)
Database management
Distributed computing
Tableau
Scrum methodologies

Tools

Airflow
Flink
Oozie
Azkaban
AWS
GCP
Snowflake
Redshift
PostgreSQL

Job description
Overview

About the job: Data Engineer_L3_ Program Tech Lead

Description: Program Tech Lead, Databricks. We are looking for a highly skilled Tech Lead to spearhead the migration of data pipelines from Snowflake to Databricks. The ideal candidate will have extensive experience in data engineering, ETL orchestration, and database management, with strong proficiency in programming and distributed computing.

Responsibilities
  • Lead the migration of data pipelines from Snowflake to Databricks.
  • Design, develop, and optimize ETL workflows and data pipelines.
  • Collaborate with cross-functional teams to understand database requirements and ensure successful migration.
  • Implement best practices for data engineering and ensure high performance and reliability of data systems.
  • Identify opportunities to optimize and reduce costs associated with data storage and processing.
Qualifications
  • Very good English - C1
  • Minimum of 4 years of professional experience in data engineering, business intelligence, or a similar role.
  • Proficiency in programming languages such as Python.
  • 7+ years of experience with ETL orchestration and workflow management tools such as Airflow, Flink, Oozie, and Azkaban on AWS or GCP.
  • Expertise in database fundamentals, Tableau or SQL, and distributed computing.
  • At least 4 years of experience with distributed data ecosystems (Spark, Hive, Druid, Presto).
  • Experience working with Snowflake, Redshift, PostgreSQL, and/or other DBMS platforms, as well as Tableau.
  • Ability to lead and mentor a team of engineers, fostering a collaborative and productive work environment.
  • Experience applying Scrum methodologies to manage project workflows and deliverables efficiently.
  • Very good Tableau / Python / SQL skills; these will be validated with real-time coding tests.
  • Minimum of 4 years of experience with the technologies required for the role.
Notes
  • Strong leadership and communication skills to manage and guide a team of engineers.

Remote: Fully remote

Sector: Communication Services
