AWS Glue Data Engineer

DeepLight

United Arab Emirates

Hybrid

AED 220,000 - 294,000

Full time

Yesterday

Job summary

A leading AI consultancy firm is looking for an experienced data engineer to build and optimize scalable ingestion pipelines using AWS Glue and related technologies. Responsibilities include implementing ETL pipelines, collaborating with platform teams, and ensuring compliance with data contracts. The ideal candidate has over 5 years of experience in data engineering and is proficient in AWS Glue and PySpark. The role offers a competitive salary, health insurance, and flexible working arrangements, with opportunities for professional development.

Skills

AWS Glue
PySpark
ETL pipeline development
Lakehouse architecture
Git
Terraform
Data quality frameworks
Strong communication skills
Agile methodologies
Jira

Tools

AWS services (S3, Athena, Lambda)

Job description

Overview

DeepLight AI is a specialist AI and data consultancy with extensive experience implementing intelligent enterprise systems across multiple industries, with particular depth in financial services and banking.

Responsibilities
  • Build scalable ETL ingestion pipelines using AWS Glue and PySpark.
  • Implement AWS Glue jobs for Bronze layer ingestion, applying defined standards and templates.
  • Design and execute historical loading mechanisms to bring legacy data into the Lakehouse.
  • Optimize Glue job performance (DPU allocation, parallelization, partitioning) according to best practices.
  • Collaborate with platform teams to ensure tooling and optimization alignment.
  • Automate migration of source tables to Bronze layer, leveraging AI-enabled acceleration.
  • Version-control jobs and automate production deployment via Git and Terraform.
  • Establish source system connectivity into CDP in collaboration with source system owners.
  • Ensure jobs comply with data contracts and monitoring standards.
  • Prepare documentation and handover to operational support teams.
  • Work closely with Data Architect for ingestion patterns and standards.
  • Coordinate with Data Assurance Lead to apply quality checks across all jobs.
  • Partner with platform engineers for tooling and optimization.

Qualifications
  • 5+ years’ experience in data engineering roles, with hands‑on experience in AWS Glue.
  • Proficiency in AWS Glue, PySpark, and ETL pipeline development.
  • Substantial knowledge of Lakehouse architecture and Medallion design principles.
  • Familiarity with CDC, delta loads, and historical data ingestion strategies.
  • Experience with AWS services: Glue, S3, Athena, Lambda.
  • Competence in Git and Terraform for CI/CD automation.
  • Knowledge of data quality frameworks (e.g., Soda Core).
  • Strong communication skills and ability to engage with various stakeholder levels.
  • Experience working in fast‑paced environments and delivering against aggressive migration targets.
  • Familiarity with Jira and agile methodologies.

Benefits & Growth Opportunities
  • Competitive salary and performance bonuses
  • Comprehensive health insurance
  • Professional development and certification support
  • Opportunity to work on cutting‑edge AI projects
  • Flexible working arrangements
  • Career advancement opportunities in a rapidly growing AI company

We are committed to fostering an inclusive environment where individuals with different thinking styles can thrive and contribute their unique strengths to our specialised AI and data solutions. All candidates are encouraged to request reasonable adjustments to the application process, should the need arise.
