

Data Engineer (L3) – Program Tech Lead

Grupo Data

Remote

BRL 80,000 – 120,000

Full-time

Posted yesterday

Job summary

A leading data solutions company seeks an experienced Tech Lead to guide the migration of data pipelines to Databricks. The ideal candidate will have significant expertise in data engineering, ETL orchestration, and proficiency in programming with Python. You will be responsible for optimizing workflows, collaborating with teams, and mentoring junior engineers. This role is remote and offers the opportunity to lead innovative projects within a dynamic environment.

Qualifications

  • Minimum of 4 years of professional experience in data engineering or a similar role.
  • Over 7 years of experience with ETL orchestration and workflow management tools.
  • At least 4 years of experience with distributed data ecosystems.

Responsibilities

  • Lead the migration of data pipelines from Snowflake to Databricks.
  • Design and develop ETL workflows and data pipelines.
  • Ensure high performance and reliability of data systems.

Skills

English - C1
Data engineering experience
Proficiency in Python
ETL orchestration
Tableau / SQL expertise
Leadership and communication skills

Tools

Airflow
Flink
Oozie
Azkaban
AWS
GCP
Spark
Hive
Druid
Presto
Snowflake
Redshift
PostgreSQL

Job description
Program Tech Lead Databricks

We are seeking a highly skilled Tech Lead to spearhead the migration of data pipelines from Snowflake to Databricks. The ideal candidate has extensive experience in data engineering, ETL orchestration, and database management, with strong proficiency in programming and distributed computing.

Key Responsibilities
  • Lead the migration of data pipelines from Snowflake to Databricks.
  • Design, develop, and optimize ETL workflows and data pipelines.
  • Collaborate with cross‑functional teams to understand database requirements and ensure successful migration.
  • Implement best practices for data engineering and ensure high performance and reliability of data systems.
  • Identify opportunities to optimize and reduce costs associated with data storage and processing.
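The responsibilities above center on designing ETL workflows. As a rough illustration only (the function names, record fields, and target table here are hypothetical placeholders, not details from this posting; a real Databricks migration would use Spark APIs rather than plain Python), the extract-transform-load stages of such a pipeline can be sketched as:

```python
# Hypothetical sketch of an ETL pipeline's stages. All names below are
# illustrative placeholders; production pipelines would run on Spark/Databricks.

def extract(source_rows):
    """Pull raw records from the source system (stubbed as an in-memory list)."""
    return list(source_rows)

def transform(rows):
    """Normalize field names and drop incomplete records."""
    return [
        {"user_id": r["id"], "amount": round(r["amount"], 2)}
        for r in rows
        if r.get("id") is not None and r.get("amount") is not None
    ]

def load(rows, target):
    """Append cleaned rows to the target store (stubbed as a dict of lists)."""
    target.setdefault("fact_payments", []).extend(rows)
    return len(rows)

def run_pipeline(source_rows, target):
    """Orchestrate extract -> transform -> load; return the number of rows loaded."""
    return load(transform(extract(source_rows)), target)
```

In an orchestrator such as Airflow, each stage would typically become its own task so that failures can be retried independently.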
Skills
  • Very good English (C1).
  • Minimum of 4 years of professional experience in data engineering, business intelligence, or a similar role.
  • Proficiency in programming languages such as Python.
  • Over 7 years of experience in ETL orchestration and workflow management tools like Airflow, Flink, Oozie, and Azkaban using AWS or GCP.
  • Expertise in database fundamentals, Tableau or SQL, and distributed computing.
  • At least 4 years of experience with distributed data ecosystems (Spark, Hive, Druid, Presto).
  • Experience working with Snowflake, Redshift, PostgreSQL, Tableau and/or other DBMS platforms.
  • Lead and mentor a team of engineers, fostering a collaborative and productive work environment.
  • Apply Scrum methodologies to manage project workflows and deliverables efficiently.
  • Very good Tableau / Python / SQL skills; these will be validated with real-time coding tests.
Note

Strong leadership and communication skills to manage and guide a team of engineers.

Details

Full remote

Sector: Communication Services
