
Senior Data Engineer

BHFT

Dubai

Remote

AED 120,000 - 200,000

Full time


Job summary

A leading technology firm in Dubai is seeking a Data Engineer to architect batch/stream pipelines and implement data storage solutions. The role requires at least 7 years of experience, strong proficiency in Python, and familiarity with market data formats. Join us for a flexible remote work environment and excellent opportunities for professional growth. Benefits include health insurance and compensation for sports activities.

Benefits

Health insurance compensation
Flexible schedule
Professional training

Qualifications

  • 7 years building production-grade data systems.
  • Familiarity with market data formats (e.g., MDP, ITCH, FIX, proprietary exchange APIs).
  • Hands-on with modern orchestration (Airflow) and event streams (Kafka).

Responsibilities

  • Architect batch/stream pipelines for diverse data.
  • Implement and tune S3 storage for analytics.
  • Develop internal libraries for data management.
  • Embed monitoring and CI/CD for reliability.

Skills

Production-grade data systems
Market data formats
Expert-level Python
Event streams (Kafka)
Strong SQL proficiency
High-throughput APIs
Strong Linux fundamentals
Mentoring and code reviews

Tools

Apache Hive
S3
Hadoop
Redshift
Spark
AWS
Apache Pig
NoSQL
Kafka
Scala
Job description
Key Responsibilities
Ingestion & Pipelines

Architect batch/stream pipelines (Airflow, Kafka, dbt) for diverse structured and unstructured market data. Provide reusable SDKs in Python and Go for internal data producers.

Storage & Modeling

Implement and tune S3, column-oriented, and time-series data storage for petabyte-scale analytics; own partitioning, compression, TTL, versioning, and cost optimisation.

Tooling & Libraries

Develop internal libraries for schema management, data contracts, validation, and lineage; contribute to shared libraries and services for internal data consumers for research, backtesting, and real-time trading purposes.

Reliability & Observability

Embed monitoring, alerting, SLAs/SLOs, and CI/CD; champion automated testing, data quality dashboards, and incident runbooks.

Collaboration

Partner with Data Science, Quant Research, Backend, and DevOps to translate requirements into platform capabilities and evangelise best practices.

Qualifications
Required Skills & Experience
  • 7 years building production-grade data systems.
  • Familiarity with market data formats (e.g., MDP, ITCH, FIX, proprietary exchange APIs) and market data providers.
  • Expert-level Python (Go and C nice to have).
  • Hands-on with modern orchestration (Airflow) and event streams (Kafka).
  • Strong SQL proficiency: aggregations, joins, subqueries, window functions (first/last, candle, histogram), indexes, query planning, and optimization.
  • Designing high-throughput APIs (REST/gRPC) and data access libraries.
  • Strong Linux fundamentals, containers (Docker), and cloud object storage (AWS S3 / GCS).
  • Proven track record of mentoring, code reviews, and driving engineering excellence.
Additional Information
What we offer
  • Working in a modern international technology company without bureaucracy, legacy systems, or technical debt.
  • Excellent opportunities for professional growth and self-realization.
  • We work remotely from anywhere in the world with a flexible schedule.
  • We offer compensation for health insurance, sports activities, and professional training.
Remote Work

Yes

Employment Type

Full-time

Key Skills

Apache Hive, S3, Hadoop, Redshift, Spark, AWS, Apache Pig, NoSQL, Big Data, Data Warehouse, Kafka, Scala

Department / Functional Area : Data Engineering

Experience

years

Vacancy

1
