Overview
About the job: Data Engineer (L3) - Program Tech Lead
Description: Program Tech Lead, Databricks. We are seeking a highly skilled Tech Lead to spearhead the migration of data pipelines from Snowflake to Databricks. The ideal candidate has extensive experience in data engineering, ETL orchestration, and database management, with strong proficiency in programming and distributed computing.
Responsibilities
- Lead the migration of data pipelines from Snowflake to Databricks.
- Design, develop, and optimize ETL workflows and data pipelines.
- Collaborate with cross-functional teams to understand database requirements and ensure successful migration.
- Implement best practices for data engineering and ensure high performance and reliability of data systems.
- Identify opportunities to optimize and reduce costs associated with data storage and processing.
Qualifications
- Very good English (C1).
- Minimum of 4 years of professional experience in data engineering, business intelligence, or a similar role.
- Proficiency in programming languages such as Python.
- 7+ years of experience with ETL orchestration and workflow management tools such as Airflow, Flink, Oozie, or Azkaban, on AWS or GCP.
- Expertise in database fundamentals, SQL and/or Tableau, and distributed computing.
- At least 4 years of experience with distributed data ecosystems (Spark, Hive, Druid, Presto).
- Experience working with Snowflake, Redshift, PostgreSQL, Tableau, and/or other data platforms.
- Ability to lead and mentor a team of engineers, fostering a collaborative and productive work environment.
- Experience applying Scrum methodologies to manage project workflows and deliverables efficiently.
- Strong Tableau, Python, and SQL skills (validated with live coding tests).
- Minimum of 4 years of experience with the technologies of the role.
Notes
- Strong leadership and communication skills to manage and guide a team of engineers.
Remote: Full remote
Sector: Communication Services