
Senior Data Engineer

BHFT

Dubai

Remote

AED 120,000 - 200,000

Full time

20 days ago

Job summary

A modern technology company in Dubai is seeking a Data Engineer to architect data ingestion pipelines and develop storage solutions. The ideal candidate has over 6 years of experience in building data systems, strong Python skills, and is proficient with tools like Kafka and SQL. This role offers flexibility for remote work and ample opportunities for professional growth.

Benefits

Flexible work schedule
Compensation for health insurance
Professional training reimbursement

Qualifications

  • 6 years of experience building data systems.
  • Expert-level Python development skills.
  • Hands-on experience with Airflow and Kafka.
  • Advanced SQL skills with query optimization.
  • Experience designing REST/gRPC APIs.

Responsibilities

  • Architect batch/stream pipelines for data ingestion.
  • Implement and tune S3 data storage for analytics.
  • Develop internal libraries for schema management.
  • Embed monitoring and alerting in data systems.
  • Collaborate with teams to translate requirements.

Skills

Production-grade data systems
Python
Airflow
Kafka
SQL
REST/gRPC APIs
Docker
AWS S3
Data formatting
Job description

Key Responsibilities

  • Ingestion & Pipelines: Architect batch/stream pipelines (Airflow, Kafka, dbt) for diverse structured and unstructured market data. Provide reusable SDKs in Python and Go for internal data producers.

  • Storage & Modeling: Implement and tune S3 column-oriented and time-series data storage for petabyte-scale analytics; own partitioning, compression, TTL, versioning, and cost optimization.

  • Tooling & Libraries: Develop internal libraries for schema management, data contracts, validation, and lineage; contribute to shared libraries and services for internal data consumers for research, backtesting, and real-time trading purposes.

  • Reliability & Observability: Embed monitoring, alerting, SLAs, SLOs, and CI/CD; champion automated testing, data-quality dashboards, and incident runbooks.

  • Collaboration: Partner with Data Science, Quant Research, Backend, and DevOps teams to translate requirements into platform capabilities and evangelise best practices.

Qualifications:

  • 6 years of experience building and maintaining production-grade data systems, with proven expertise in architecting and launching data lakes from scratch.
  • Expert-level Python development skills (Go and C nice to have).
  • Hands-on experience with modern orchestration tools (Airflow) and streaming platforms (Kafka).
  • Advanced SQL skills, including complex aggregations, window functions, query optimization, and indexing.
  • Experience designing high-throughput APIs (REST/gRPC) and data access libraries.
  • Solid fundamentals in Linux, containerization (Docker), and cloud object storage solutions (AWS S3, GCS).
  • Strong knowledge of handling diverse data formats, including structured and unstructured data, with experience optimizing storage strategies such as partitioning, compression, and cost management.
  • English at C1 level: confident communication, documentation, and collaboration within an international team.

Additional Information:

What we offer:

  • Working in a modern international technology company without bureaucracy, legacy systems, or technical debt.
  • Excellent opportunities for professional growth and self-realization.
  • We work remotely from anywhere in the world with a flexible schedule.
  • We offer compensation for health insurance, sports activities, and professional training.

Remote Work:

Yes

Employment Type:

Full-time

Key Skills:
Apache Hive, S3, Hadoop, Redshift, Spark, AWS, Apache Pig, NoSQL, Big Data, Data Warehouse, Kafka, Scala

Department / Functional Area:
Data Engineering

Experience:
years

Vacancy:
1
