
Senior Data Engineer

Dicetek LLC

Abu Dhabi

On-site

AED 120,000 - 200,000

Full time


Job summary

A leading technology solutions provider in Abu Dhabi is seeking a Data Engineering Lead to oversee ETL/ELT processes and Azure Data Services. This role requires extensive experience in Data Engineering, with a focus on Azure Data Factory and Databricks. The ideal candidate will have a Bachelor's degree in Computer Science, strong skills in Python and SQL, and a proven ability to lead teams. Competitive salary and benefits offered.

Qualifications

  • 5+ years of experience in Data Engineering.
  • 5+ years of ETL development experience using Talend or Azure Data Services.
  • 3+ years of hands-on experience in Python development.

Responsibilities

  • Manage ETL processes and Azure Data Services.
  • Design data models and develop comprehensive data solutions.
  • Lead and mentor team members in data engineering activities.

Skills

ETL methodologies
Big Data
Data Warehousing
Python
SQL
Azure Data Services

Education

Bachelor’s Degree in Computer Science

Tools

Azure Data Factory
Azure Databricks
Talend
Snowflake

Job description
Overview

The Data Engineering Lead is responsible for managing the development and support of internally built or supported ETL/ELT processes and Azure Data Services, including Azure Data Factory, Azure Databricks, Big Data, and streaming solutions. This role involves gathering business requirements, designing data models, and developing comprehensive data solutions.

Responsibilities
Data Engineering (ETL/ELT & Data Processing)
  • Analyze, understand, and document business requirements to design data-driven solutions that align with organizational objectives.
  • Apply expertise in ETL methodologies, Big Data, and Data Warehousing principles, including architecture design, data modeling, and performance optimization.
  • Design, develop, test, optimize, and deploy ETL pipelines using Azure Data Factory (ADF), Azure Databricks, and Azure Data Lake Storage.
  • Familiarity with leading ETL tools such as Talend, Informatica, and Pentaho for building scalable ETL/ELT workflows is a plus.
  • Translate source-to-target mapping documents into efficient ETL processes to support data integration and transformation.
  • Develop and maintain robust ETL code and stored procedures using SQL, Python, and Spark, as sketched below.
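
To make the pipeline work above concrete, the sketch below applies a simple source-to-target mapping in PySpark, the kind of step an Azure Databricks job might run when curating raw files from Azure Data Lake Storage into a Delta table. The storage paths and column names are hypothetical placeholders, not details from this posting.

```python
# Illustrative PySpark ETL step for Azure Databricks (paths/columns are placeholders).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Read raw source files from Azure Data Lake Storage Gen2.
raw = (
    spark.read
    .option("header", True)
    .csv("abfss://raw@examplelake.dfs.core.windows.net/sales/orders/")
)

# Apply a simple source-to-target mapping: rename, cast, and derive columns.
curated = (
    raw.select(
        F.col("ORDER_ID").cast("bigint").alias("order_id"),
        F.col("ORDER_DT").cast("date").alias("order_date"),
        F.col("CUST_ID").cast("bigint").alias("customer_id"),
        F.col("AMT").cast("decimal(18,2)").alias("order_amount"),
    )
    .withColumn("load_ts", F.current_timestamp())
    .dropDuplicates(["order_id"])
)

# Write the curated output in Delta format for downstream consumption.
(
    curated.write
    .format("delta")
    .mode("overwrite")
    .save("abfss://curated@examplelake.dfs.core.windows.net/sales/orders/")
)
```
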
Cloud & Data Warehousing
  • Design, implement, and manage ETL processes on Microsoft Azure Cloud, with expertise in Azure Data Factory (ADF), Azure Databricks, Azure Synapse Analytics, and Azure Data Lake.
  • Administer and maintain cloud-based data platforms, including Azure Cloud and the Snowflake Enterprise Data Warehouse, ensuring data security, performance, and availability (see the Snowflake sketch after this list).
  • Develop and manage Spark jobs within Azure Databricks for large-scale data processing.
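
For the Snowflake side of platform administration, the sketch below shows a minimal load-and-verify step using the snowflake-connector-python package. The account, warehouse, stage, and table names are placeholders, and credentials would normally come from a secret store rather than code.

```python
# Illustrative Snowflake load step (object names and credentials are placeholders).
import snowflake.connector

conn = snowflake.connector.connect(
    account="example_account",
    user="ETL_SERVICE_USER",
    password="***",              # in practice, fetch from Azure Key Vault or similar
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="CURATED",
)

try:
    cur = conn.cursor()
    # Bulk-load staged files into the target table, then run a basic quality check.
    cur.execute(
        "COPY INTO CURATED.SALES_ORDERS FROM @RAW_STAGE/sales/orders/ "
        "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
    )
    cur.execute("SELECT COUNT(*) FROM CURATED.SALES_ORDERS WHERE ORDER_ID IS NULL")
    null_keys = cur.fetchone()[0]
    if null_keys:
        raise ValueError(f"{null_keys} rows were loaded without an ORDER_ID")
finally:
    conn.close()
```
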
CI/CD & DevOps
  • Operate in an Agile development environment, applying best practices for CI/CD using Azure DevOps, GitHub, and automation tools.
  • Automate deployment workflows, manage version control, and ensure continuous integration and delivery of data solutions.
  • Implement DevOps practices for monitoring, testing, and optimizing data pipelines to guarantee high system reliability and performance, as illustrated by the test sketch below.
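
As an example of the testing side of CI/CD, the sketch below is a small pytest check for a hypothetical pipeline transformation; a test like this could run as a step in an Azure DevOps build before a data solution is deployed. The function and columns are illustrative, not taken from this posting.

```python
# Illustrative pytest data-quality check for a pipeline transformation (hypothetical).
import pandas as pd


def deduplicate_orders(df: pd.DataFrame) -> pd.DataFrame:
    """Transformation under test: keep the latest record per order_id."""
    return (
        df.sort_values("load_ts")
        .drop_duplicates(subset=["order_id"], keep="last")
        .reset_index(drop=True)
    )


def test_deduplicate_orders_keeps_latest_record():
    df = pd.DataFrame(
        {
            "order_id": [1, 1, 2],
            "order_amount": [10.0, 12.5, 7.0],
            "load_ts": pd.to_datetime(["2024-01-01", "2024-01-02", "2024-01-01"]),
        }
    )
    result = deduplicate_orders(df)
    assert len(result) == 2
    assert result.loc[result["order_id"] == 1, "order_amount"].item() == 12.5
```
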
Monitoring & Maintenance
  • Proactively monitor, troubleshoot, and maintain ETL components to ensure data quality, system stability, and timely data delivery (see the monitoring sketch after this list).
  • Optimize data workflows for performance, scalability, and cost-efficiency in cloud environments.
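
For monitoring, the sketch below triggers an Azure Data Factory pipeline and polls its run status, assuming the azure-identity and azure-mgmt-datafactory packages; the subscription, resource group, factory, and pipeline names are placeholders.

```python
# Illustrative Azure Data Factory run check (all resource names are placeholders).
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

SUBSCRIPTION_ID = "00000000-0000-0000-0000-000000000000"
RESOURCE_GROUP = "rg-data-platform"
FACTORY_NAME = "adf-example"

client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Trigger a pipeline run, then check its status so failures surface quickly.
run = client.pipelines.create_run(RESOURCE_GROUP, FACTORY_NAME, "pl_load_sales_orders")
status = client.pipeline_runs.get(RESOURCE_GROUP, FACTORY_NAME, run.run_id).status
print(f"Pipeline run {run.run_id} is {status}")
```
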
Machine Learning & API Integration
  • Deploy machine learning (ML) models in the cloud using Azure Machine Learning and integrate them into data pipelines.
  • Design and integrate APIs to enable seamless data flow across systems and applications, as sketched below.
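
On the API side, the sketch below wraps a pre-trained model in a small FastAPI scoring service so other systems and pipelines can call it over HTTP. The model file, feature names, and scikit-learn-style predict_proba interface are hypothetical; an Azure Machine Learning managed endpoint would be an alternative deployment route. It can be run locally with "uvicorn scoring_api:app" if the module is saved as scoring_api.py.

```python
# Illustrative scoring API wrapping a pickled, scikit-learn-style model (placeholders).
import pickle

from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="churn-scoring-api")

with open("model.pkl", "rb") as f:   # placeholder for a packaged/registered model
    model = pickle.load(f)


class CustomerFeatures(BaseModel):
    tenure_months: int
    monthly_spend: float
    support_tickets: int


@app.post("/score")
def score(features: CustomerFeatures) -> dict:
    row = [[features.tenure_months, features.monthly_spend, features.support_tickets]]
    return {"churn_probability": float(model.predict_proba(row)[0][1])}
```
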
Leadership & Collaboration
  • Lead, mentor, and support team members in data engineering activities, promoting knowledge sharing and best practices.
  • Collaborate effectively with business stakeholders, data scientists, and IT teams to ensure that technical solutions align with business needs.
Qualifications

Essential:

  • Bachelor’s Degree in Computer Science, Engineering, or a related field.
  • 5+ years of experience in Data Engineering.
  • 5+ years of ETL development experience using Talend or Azure Data Services.
  • 5+ years of experience writing complex SQL queries, stored procedures, and views.
  • 3+ years of hands-on experience in Python development.
  • 3+ years of experience with Azure or other cloud-based solutions.
  • Strong proficiency in Python and SQL for data engineering tasks.
  • In-depth expertise in Azure Data Services, including:
      • Azure Data Factory (ADF)
      • Azure Synapse Analytics

Desirable:

  • Advanced certifications in Azure Data Engineering (e.g., Azure Data Factory, Azure ML).
  • Certification in Azure Databricks or any cloud data warehouse (e.g., Snowflake, Redshift).
  • Data Science certification is an advantage.
  • 3+ years of experience with Azure Databricks or similar cloud data warehousing tools.
  • 2+ years of experience with Azure Data Factory (ADF).
  • Experience with machine learning algorithms, leveraging Python, Azure Databricks, and Azure Machine Learning for advanced analytics.
Technical Expertise
  • Strong hands-on experience with Databricks for end-to-end data processing and transformation.
  • Expertise in data warehousing concepts with practical experience in ETL/ELT development, data processing, and optimization within Azure environments.
  • Proven ability to deploy and manage machine learning models using Azure Machine Learning and related cloud-based solutions.
  • Expertise in managing cloud-based data warehouses, particularly Snowflake and Azure Data Warehouses.
  • Proficiency in designing, writing, and debugging complex SQL queries, stored procedures, views, triggers, and functions in SQL Server, Oracle, and data warehouse environments.
  • Solid background in data modeling, using tools like Erwin to design efficient, scalable data architectures.
  • Hands-on experience with Azure DevOps and GitHub for version control, CI/CD pipelines, and automated deployments.
  • Strong understanding of Master Data Management (MDM) solutions and customer data profiling techniques.
  • Excellent communication and collaboration skills with the ability to work effectively across cross-functional teams.