Job Search and Career Advice Platform

Senior Data Engineer

BHFT

Remote

AED 257,000 - 368,000

Full time

Yesterday

Job summary

A modern international technology company in the United Arab Emirates seeks a full-time Senior Data Engineer to architect and manage advanced data systems. The role requires at least 7 years of experience building production-grade data systems, expert-level Python, and hands-on familiarity with event streams such as Kafka. The company offers excellent growth opportunities, remote work flexibility, and compensation for health insurance and professional training.

Benefits

Professional growth opportunities
Compensation for health insurance
Flexible schedule
Support for sports activities and training

Qualifications

  • 7 years of experience building production-grade data systems.
  • Familiar with market data formats like MDP, ITCH, FIX.
  • Hands-on experience with event streams like Kafka.

Responsibilities

  • Architect and manage batch/stream pipelines using Airflow and Kafka.
  • Implement S3 storage solutions for analytics at petabyte scale.
  • Develop internal libraries for schema management and data contracts.

Skills

Production-grade data systems
Understanding of market data formats
Expert-level Python
Modern orchestration (Airflow)
Strong SQL proficiency
High-throughput API design
Strong Linux fundamentals
Mentoring and code reviews

Tools

Docker
AWS S3
GCS

Job description

Key Responsibilities
  • Ingestion & Pipelines: Architect batch/stream pipelines (Airflow, Kafka, dbt) for diverse structured and unstructured market data. Provide reusable SDKs in Python and Go for internal data producers (a minimal pipeline sketch follows this list).

  • Storage & Modeling: Implement and tune S3 column-oriented and time-series data storage for petabyte-scale analytics; own partitioning, compression, TTL, versioning, and cost optimisation.

  • Tooling & Libraries: Develop internal libraries for schema management, data contracts, validation, and lineage; contribute to shared libraries and services used by internal data consumers for research, backtesting, and real-time trading.

  • Reliability & Observability: Embed monitoring, alerting, SLAs, SLOs, and CI/CD; champion automated testing, data quality dashboards, and incident runbooks.

  • Collaboration: Partner with Data Science, Quant Research, Backend, and DevOps to translate requirements into platform capabilities and evangelise best practices.
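
For orientation only: a minimal sketch of the kind of batch pipeline the Ingestion & Pipelines bullet describes, assuming Apache Airflow 2.4+ (for the schedule argument). The DAG id, task names, and the ingest/validate helpers are hypothetical placeholders, not part of the company's actual stack.

from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def ingest_market_data(**context):
    # Placeholder: land one day of raw market data (e.g. ITCH/FIX dumps)
    # from an exchange feed or vendor API into object storage.
    print("ingesting partition", context["ds"])


def validate_schema(**context):
    # Placeholder: check the landed files against a registered schema /
    # data contract before downstream models run.
    print("validating partition", context["ds"])


with DAG(
    dag_id="market_data_daily",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
):
    ingest = PythonOperator(task_id="ingest", python_callable=ingest_market_data)
    validate = PythonOperator(task_id="validate", python_callable=validate_schema)

    ingest >> validate  # validation runs only after ingestion succeeds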

Qualifications
Required Skills & Experience
  • 7+ years building production-grade data systems.
  • Familiarity with market data formats (e.g. MDP, ITCH, FIX, proprietary exchange APIs) and market data providers.
  • Expert-level Python (Go and C nice to have).
  • Hands-on with modern orchestration (Airflow) and event streams (Kafka).
  • Strong SQL proficiency: aggregations, joins, subqueries, window functions (first, last, candle, histogram), indexes, query planning and optimization (a small candle-query sketch follows this list).
  • Designing high-throughput APIs (REST/gRPC) and data access libraries.
  • Strong Linux fundamentals, containers (Docker), and cloud object storage (AWS S3 / GCS).
  • Proven track record of mentoring, code reviews, and driving engineering excellence.
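
Likewise, a small, self-contained illustration of the candle (OHLC) window-function pattern named in the SQL bullet. It runs the query through DuckDB purely so the sketch is executable; the trades table, symbol, and prices are made-up sample data.

import duckdb

con = duckdb.connect()
con.execute("""
    CREATE TABLE trades AS
    SELECT * FROM (VALUES
        (TIMESTAMP '2024-01-01 09:30:01', 'ABC', 100.0),
        (TIMESTAMP '2024-01-01 09:30:20', 'ABC', 101.5),
        (TIMESTAMP '2024-01-01 09:30:45', 'ABC',  99.8),
        (TIMESTAMP '2024-01-01 09:31:05', 'ABC', 100.4)
    ) AS t(ts, symbol, price)
""")

# One-minute OHLC candles: first/last price per bucket via window functions,
# high/low via plain aggregates over the same window.
rows = con.execute("""
    SELECT DISTINCT
        symbol,
        date_trunc('minute', ts)  AS minute,
        first_value(price) OVER w AS "open",
        max(price)         OVER w AS high,
        min(price)         OVER w AS low,
        last_value(price)  OVER w AS "close"
    FROM trades
    WINDOW w AS (
        PARTITION BY symbol, date_trunc('minute', ts)
        ORDER BY ts
        ROWS BETWEEN UNBOUNDED PRECEDING AND UNBOUNDED FOLLOWING
    )
    ORDER BY minute
""").fetchall()

for row in rows:
    print(row)
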
Additional Information

What we offer:

  • Working in a modern international technology company without bureaucracy, legacy systems, or technical debt.
  • Excellent opportunities for professional growth and self-realization.
  • We work remotely from anywhere in the world with a flexible schedule.
  • We offer compensation for health insurance, sports activities, and professional training.
Remote Work

Yes

Employment Type

Full-time
