Data Engineer

DeepLight AI

Abu Dhabi

On-site

AED 120,000 - 200,000

Full time

Job summary

A leading AI consultancy in Abu Dhabi is seeking a Data Engineer to design and optimise the data pipelines that support advanced AI systems. The successful candidate will work closely with a multidisciplinary team, ensuring efficient data collection, storage, and processing. Responsibilities include implementing data governance frameworks and CI/CD for machine learning. Ideal candidates will have expertise in Lakehouse architectures and cloud platforms, along with strong analytical skills. The role offers a competitive salary, health insurance, and career growth opportunities.

Qualifications

  • Proven experience in Data Lakehouse architectures.
  • Hands-on experience with big data technologies.
  • Expertise in SQL and scripting languages like Python.
  • Experience implementing MLOps pipelines.
  • Strong cloud platform experience (AWS, Azure, GCP).

Responsibilities

  • Design and optimise data solutions using Lakehouse architecture.
  • Maintain data pipelines for diverse datasets.
  • Implement standards for data governance.
  • Automate ML lifecycle with CI/CD.
  • Monitor model performance metrics.

Skills

Data Lakehouse architectures
Data ingestion management
Big Data technologies
ETL/ELT processes
MLOps principles
CI/CD tools
Cloud platforms (AWS, Azure, GCP)
Analytical and problem-solving skills

Tools

Databricks
Spark
Kafka
Docker
Kubernetes
TensorFlow Serving
MLflow

Job description

DeepLight AI is a specialist AI and data consultancy with extensive experience implementing intelligent enterprise systems across multiple industries, with particular depth in financial services and banking. Our team combines deep expertise in data science, statistical modeling, AI/ML technologies, workflow automation, and systems integration with a practical understanding of complex business operations.

The Data Engineer is responsible for designing, implementing, and optimising data pipelines and infrastructure to support our cutting‑edge AI systems. The Data Engineer collaborates closely with our multidisciplinary team to ensure the efficient collection, storage, processing, and analysis of large‑scale data, enabling us to unlock valuable insights and drive innovation across various domains.

Responsibilities of the role
  • Design, build, and optimise scalable data solutions, primarily utilising the Lakehouse architecture to unify data warehousing and data lake capabilities. Advise stakeholders on the strategic choice between Data Warehouse, Data Lake, and Lakehouse architectures based on specific business needs, cost, and latency requirements.
  • Design, develop, and maintain scalable and reliable data pipelines to ingest, transform, and load diverse datasets from various sources, including structured and unstructured data, streaming data, and real‑time feeds.
  • Implement standards and tooling to ensure ACID properties, schema evolution, and high data quality within the Lakehouse environment (a minimal ingestion sketch follows this list), and put in place robust data governance frameworks covering security, privacy, integrity, compliance, and auditing.
  • Continuously optimise data storage, compute resources, and query performance across the data platform to reduce costs and improve latency for both BI and ML workloads, leveraging techniques such as indexing, partitioning, and parallel processing.
  • Develop and maintain CI/CD pipelines to automate the entire machine learning lifecycle, from data validation and model training to deployment and infrastructure provisioning.
  • Deploy, manage, and scale machine learning models into production environments, utilising MLOps principles for reliable and repeatable operations.
  • Establish and manage monitoring systems to track model performance metrics and to detect both data drift (changes in input data) and model decay (degradation in prediction accuracy).
  • Ensure rigorous version control and tracking for all components: code, datasets, and trained model artifacts (using tools like MLflow or similar).
  • Create comprehensive documentation, including technical specifications, data flow diagrams, and operational procedures, to facilitate understanding, collaboration, and knowledge sharing.
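
As a concrete illustration of the ingestion and schema-enforcement responsibilities above, here is a minimal sketch using PySpark with Delta Lake, the storage layer commonly used in Databricks Lakehouse deployments. The source path, table location, and the ingest_date column are illustrative assumptions, not details from the role.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import current_date

# Build a Spark session with the Delta Lake extensions; on a Databricks
# cluster this configuration is already in place.
spark = (
    SparkSession.builder.appName("bronze-ingest")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
)

# Read a raw CSV feed (illustrative path) and stamp each row with its
# ingestion date so the target table can be partitioned by it.
raw = (
    spark.read.option("header", True).csv("/mnt/raw/transactions/")
    .withColumn("ingest_date", current_date())
)

# Append to a partitioned Delta table. Delta writes are ACID, and the
# existing table schema is enforced by default: an append whose columns
# do not match fails unless schema evolution is explicitly allowed
# (e.g., with .option("mergeSchema", "true")).
(
    raw.write.format("delta")
    .mode("append")
    .partitionBy("ingest_date")
    .save("/mnt/lakehouse/bronze/transactions")
)
```

Outside Databricks, the open-source delta-spark package supplies the same extensions; the point of the sketch is the trade-off between enforcement by default and opt-in schema evolution.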
Requirements
  • Proven practical experience in designing, building, and optimising solutions using Data Lakehouse architectures (e.g., Databricks, Delta Lake).
  • Strong hands-on experience managing data ingestion, schema enforcement, and ACID properties, and working with big data technologies and frameworks such as Spark and Kafka.
  • Expertise in data modeling, ETL/ELT processes, and data warehousing concepts. Proficiency in SQL and scripting languages (e.g., Python, Scala).
  • Demonstrated practical experience implementing MLOps pipelines for production systems. This includes a solid understanding and implementation experience with MLOps principles: automation, governance, and monitoring of ML models throughout the entire lifecycle.
  • Experience with CI/CD tools, containerisation/orchestration technologies (e.g., Docker, Kubernetes), model serving frameworks (e.g., TensorFlow Serving, SageMaker), and experiment tracking (e.g., MLflow; a minimal sketch follows this list).
  • Experience with production monitoring tools to detect data drift or model decay.
  • Strong hands‑on experience with major cloud platforms (e.g., AWS, Azure, GCP) and familiarity with DevOps practices.
  • Excellent analytical, problem‑solving, and communication skills, with the ability to translate complex technical concepts into clear and actionable insights.
  • Proven ability to work effectively in a fast‑paced, collaborative environment, with a passion for innovation and continuous learning.
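
The experiment-tracking and versioning requirements above can likewise be made concrete. The following is a minimal sketch with MLflow and scikit-learn; the experiment name, model choice, and parameters are illustrative assumptions only.

```python
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for a real training set.
X, y = make_classification(n_samples=1_000, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

mlflow.set_experiment("demo-classifier")  # hypothetical experiment name

with mlflow.start_run():
    params = {"n_estimators": 200, "max_depth": 8}
    model = RandomForestClassifier(**params, random_state=42)
    model.fit(X_train, y_train)

    # Log parameters, a metric, and the trained artifact so that every
    # run is reproducible and every model version is traceable.
    mlflow.log_params(params)
    mlflow.log_metric("accuracy", accuracy_score(y_test, model.predict(X_test)))
    mlflow.sklearn.log_model(model, "model")
```

Runs logged this way can be compared in the MLflow UI, and a logged model can be registered and promoted from staging to production, one common way to realise the tracking and deployment workflow these requirements describe.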
Benefits & Growth Opportunities
  • Competitive salary and performance bonuses
  • Comprehensive health insurance
  • Professional development and certification support
  • Opportunity to work on cutting‑edge AI projects
  • Flexible working arrangements
  • Career advancement opportunities in a rapidly growing AI company

This position offers a unique opportunity to shape the future of AI implementation while working with a talented team of professionals at the forefront of technological innovation. The successful candidate will play a crucial role in driving our company's success in delivering transformative AI solutions to our clients.
