Senior Data Engineer

Property Finder Group

Dubai

On-site

AED 120,000 - 200,000

Full time

Today

Job summary

A leading property portal in the MENA region is seeking an experienced Senior Data Engineer to develop and optimize data solutions. Responsibilities include maintaining data pipelines, ensuring data quality, and collaborating with teams on scalable data architecture. Ideal candidates will have strong skills in SQL, Python, and AWS, with over 7 years of experience in data engineering.

Qualifications

  • Proficient in building automation solutions to improve efficiency.
  • Deep expertise in AWS services and data warehouse concepts.
  • Experienced in building both batch and streaming pipelines.

Responsibilities

  • Maintain and optimize end-to-end data pipelines and integrations.
  • Actively participate in architecture discussions.
  • Ensure the highest standards of data quality and governance.

Skills

SQL
Python
Java
Scala
Spark
PySpark
AWS
Kafka
Kinesis
Terraform

Experience

7+ years working as a Data Engineer

Tools

Snowflake
Redshift
Clickhouse
Dagster
FiveTran
DBT
AWS Glue

Job description
Overview

Property Finder is the leading property portal in the Middle East and North Africa (MENA) region, dedicated to shaping an inclusive future for real estate while spearheading the region’s growing tech ecosystem. At its core is a clear and powerful purpose: To change living for good in the region. Founded on the value of great ambitions, Property Finder connects millions of property seekers with thousands of real estate professionals every day. The platform offers a seamless and enriching experience, empowering both buyers and renters to make informed decisions. Since its inception in 2007, Property Finder has evolved into a trusted partner for developers, brokers, and home seekers. As a lighthouse tech company, it continues to create an environment where people can thrive and contribute meaningfully to the transformation of real estate in MENA.

Reports To:

Summary

As a Senior Data Engineer, you will be responsible for the end-to-end design, development, and maintenance of data pipelines and integrations, ensuring reliability, scalability, and performance across all data solutions.

Responsibilities
  • Maintain and optimize end-to-end data pipelines and integrations with a strong focus on performance and cost efficiency.
  • Actively participate in architecture discussions, advising on best practices and contributing to the design of scalable and future-proof solutions.
  • Optimize and fine-tune SQL and/or Python code for performance.
  • Design and maintain data models for efficient storage and retrieval.
  • Build innovative and optimized data solutions, thinking outside the box while avoiding legacy approaches and ensuring consistency across systems.
  • Contribute to the development and maintenance of the infrastructure required to support advanced data engineering processes.
  • Collaborate with cross-functional teams to design and implement self-service data layers, empowering stakeholders with accessible, high-quality data.
  • Ensure the highest standards of data quality, reliability, and governance in all deliverables.
  • Support team members in their work and act as a mentor when needed, fostering knowledge sharing and professional growth.
  • Manage stakeholders effectively, ensuring clear, transparent communication and alignment on priorities.
  • Demonstrate strong prioritization skills, balancing short-term deliverables with long-term strategic initiatives to maximize impact.
Impact of the Role

In this position as a Senior Data Engineer, you will drive the development of sophisticated data solutions that support business growth and innovation. Your expertise in data architecture and engineering will ensure technical solutions align directly with broader business strategies, while your focus on innovation will help the team deliver impactful and sustainable outcomes.

Qualifications

  • Languages: Java/Scala, SQL & Python
  • Message Queues: Kafka, Kinesis Data Streams
  • Data stores: Snowflake, Redshift, Clickhouse, S3
  • Pipeline orchestration tool: Dagster (Legacy: Airflow)
  • PaaS: AWS (ECS/EKS, DMS, Kinesis, Glue, Athena, S3, and others)
  • ETL: FiveTran & DBT for transformation
  • IaC: Terraform (with Terragrunt)

  • 7+ years working as a Data Engineer.
  • Advanced knowledge of SQL, Python, Spark/PySpark, and data warehouse concepts and modeling.
  • Experienced in designing and building data solutions to address complex business problems.
  • Deep expertise in AWS services, delivering scalable solutions and leveraging serverless architectures.
  • Skilled in building automation solutions to reduce repetitive tasks and improve efficiency.
  • Solid understanding of data warehousing, dimensional modeling, and industry best practices.
  • Proficient with orchestration tools such as Dagster, Airflow, and AWS Step Functions.
  • Proficient with ETL/ELT tools including AWS Glue, dbt, Informatica, and Talend.
  • Knowledgeable in working with diverse data stores such as MySQL, MongoDB, DynamoDB, and Aurora.
  • Experienced in building both batch and streaming pipelines.
  • Familiar with applying Generative AI services to enhance and build data-driven solutions.
  • Strong understanding of CI/CD pipelines and automation for data workflows.
Other Desired Experience
  • Experience with the Dubai Land Department or within the real estate sector is desirable.
  • Experience with modern cloud data warehouse and data lake solutions such as Snowflake, BigQuery, and Redshift.
  • Experience with AWS services (like S3, DMS, Glue, Athena, EKS, etc.)
  • Proven experience with data warehousing and dimensional data modelling guidelines and best practices
  • Familiar with ETL tools like FiveTran, DBT, Airbyte, etc.
  • Experience with orchestration tools like Dagster, Airflow, AWS Step Functions, etc.
  • Knowledge of CI/CD pipelines and automation.
  • Experience with Terraform and Terragrunt.
  • Experience with data stores like MySQL, MongoDB, DynamoDB, Aurora.
  • Familiar with BI tools like Tableau, QuickSight, PowerBI, MicroStrategy, etc.
  • Familiar with tagging and tracking tools like Snowplow, Tealium, Segment.io, etc.
  • Experience with real-time analytics solutions like Clickhouse, Rockset, Pinot, or Druid.
  • Experience with GCP and Google Analytics.
Our promise to talent

At Property Finder, we believe talent thrives in an environment where you can be your best self. Where you are empowered to create, elevate, grow, and care. Our team is made up of the best and brightest, united by a shared ambition to change living for good in the region. We attract top talent who want to make an impact. We firmly believe that when our people grow, we all succeed.

Property Finder Guiding Principles
  • Think Future First
  • Data Beats Opinions, Speed Beats Perfection
  • Our People, Our Power
  • The Biggest Risk is Taking no Risk at All
