Senior Data Engineer

BHFT

United Arab Emirates

Remote

AED 120,000 - 200,000

Full time

21 days ago

Job summary

A modern international technology company is seeking a Data Engineer to architect batch/stream pipelines and implement data storage solutions. The ideal candidate has over 6 years of experience building production-grade data systems and is proficient in Python, SQL, and orchestration tools like Airflow. This full-time role allows for remote work and offers professional growth opportunities without the burden of legacy systems.

Benefits

Flexible schedule
Health insurance compensation
Support for sports activities
Professional training compensation

Qualifications

  • 6+ years of experience in building and maintaining production-grade data systems.
  • Expert-level Python development skills.
  • Hands-on experience with Airflow and Kafka.
  • Advanced SQL skills for complex aggregations and query optimization.
  • Experience in designing high-throughput APIs (REST/gRPC).
  • Solid fundamentals in Linux and Docker.

Responsibilities

  • Architect batch/stream pipelines for diverse data.
  • Implement S3 data storage for analytics.
  • Develop internal libraries for data management.
  • Embed monitoring and alerting in data systems.
  • Collaborate with various teams to enhance platform capabilities.

Skills

Python
Data Engineering
SQL
Airflow
Kafka
APIs
Docker

Tools

AWS S3
GCS

Job description

Key Responsibilities

  • Ingestion & Pipelines: Architect batch/stream pipelines (Airflow, Kafka, dbt) for diverse structured and unstructured market data. Provide reusable SDKs in Python and Go for internal data producers (a minimal pipeline sketch follows this list).

  • Storage & Modeling: Implement and tune S3 column-oriented and time-series data storage for petabyte-scale analytics; own partitioning, compression, TTL, versioning, and cost optimization.

  • Tooling & Libraries: Develop internal libraries for schema management, data contracts, validation, and lineage; contribute to shared libraries and services used by internal data consumers for research, backtesting, and real-time trading.

  • Reliability & Observability: Embed monitoring, alerting, SLAs/SLOs, and CI/CD; champion automated testing, data quality dashboards, and incident runbooks.

  • Collaboration: Partner with Data Science, Quant Research, Backend, and DevOps to translate requirements into platform capabilities and evangelise best practices.
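
As a flavor of the ingestion work above, here is a minimal batch-pipeline sketch using Airflow's TaskFlow API (Airflow 2.x). It is illustrative only: the DAG name, bucket paths, and task bodies are hypothetical placeholders, not part of this posting.

import pendulum
from airflow.decorators import dag, task

@dag(
    schedule="@daily",
    start_date=pendulum.datetime(2024, 1, 1, tz="UTC"),
    catchup=False,
    tags=["market-data"],
)
def market_data_daily():
    @task
    def extract(ds=None):
        # Land one day of raw market data and return its staging path
        # (ds is the logical date Airflow injects into the task).
        return f"s3://raw-staging/market/{ds}/events.jsonl"

    @task
    def transform_and_load(staging_path, ds=None):
        # Validate against a data contract, then write date-partitioned,
        # compressed columnar output for downstream analytics.
        print(f"{staging_path} -> s3://data-lake/market/dt={ds}/part-0.parquet")

    transform_and_load(extract())

market_data_daily()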


Qualifications:

  • 6+ years of experience building and maintaining production-grade data systems with proven expertise in architecting and launching data lakes from scratch.
  • Expert-level Python development skills (Go and C are nice to have).
  • Hands-on experience with modern orchestration tools (Airflow) and streaming platforms (Kafka).
  • Advanced SQL skills, including complex aggregations, window functions, query optimization, and indexing.
  • Experience designing high-throughput APIs (REST/gRPC) and data access libraries.
  • Solid fundamentals in Linux, containerization (Docker), and cloud object storage solutions (AWS S3, GCS).
  • Strong knowledge of handling diverse data formats, including structured and unstructured data, with experience optimizing storage strategies such as partitioning, compression, and cost management (see the sketch after this list).
  • English at C1 level: confident communication, documentation, and collaboration within an international team.
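
A short sketch of the partitioning-and-compression idea mentioned above, using pyarrow; the table, paths, and column names are hypothetical:

import pyarrow as pa
import pyarrow.parquet as pq

# A toy table standing in for one day of market data.
table = pa.table({
    "dt": ["2024-01-01", "2024-01-01", "2024-01-02"],
    "symbol": ["BTC-USDT", "ETH-USDT", "BTC-USDT"],
    "price": [42000.5, 2250.1, 42100.0],
})

# Hive-style partitioning (market_lake/dt=2024-01-01/...) keeps scans narrow,
# and zstd compression shrinks objects, which is what drives storage cost down.
pq.write_to_dataset(
    table,
    root_path="market_lake",  # a local path for the sketch; an s3:// URI works
    partition_cols=["dt"],    # once credentials and a filesystem are configured
    compression="zstd",       # passed through to the underlying Parquet writer
)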

Additional Information:

What we offer:

  • Working in a modern international technology company without bureaucracy, legacy systems, or technical debt.
  • Excellent opportunities for professional growth and self-fulfillment.
  • We work remotely from anywhere in the world with a flexible schedule.
  • We offer compensation for health insurance, sports activities, and professional training.

Remote Work:

Yes


Employment Type:

Full-time
