Lead Data Engineer

Alpheya

United Arab Emirates

On-site

AED 250,000 - 350,000

Full time



Job summary

A leading B2B WealthTech startup in the UAE seeks a skilled Lead Data Engineer to manage data pipelines and APIs. You will lead a team to enhance our data platform, ensuring seamless flow of market data while developing production-ready solutions. Ideal candidates will have 5+ years in production Python backends, experience with ETL workflows, and strong Python mastery. Join us to power and grow our clients’ wealth franchises with innovative data technology.

Job description

About Alpheya: We are a B2B WealthTech startup based in Abu Dhabi, backed by BNY Mellon (America’s oldest bank and the first company listed on the NYSE) and Lunate (a $50B AUM alternative asset management firm based in Abu Dhabi, UAE). The company has raised $300M to build a state‑of‑the‑art wealth technology platform. Our mission is to power and grow our clients’ wealth franchises through differentiated experiences, financial solutions, and insights. Our digital wealth‑management platform will enable banks and other financial institutions in the Middle East to grow and further penetrate the affluent, HNW, and UHNW investor segments. Our fintech combines the capabilities and knowledge of large organizations with the truly cross‑functional, agile teams of a startup.

For more information, please visit www.alpheya.com.

Role Description and Responsibilities

We’re looking for a skilled Lead Data Engineer to contribute to the Python services that power our data platform. You’ll help build and maintain high‑throughput ETL pipelines that ingest real‑time market data and expose it through secure CRUD APIs. Working alongside other backend and data engineers, you’ll make sure data flows smoothly from external sources into our internal data platform.
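The ETL work described above can be sketched minimally in plain Python: extract raw provider rows, transform them, and load them into a database. This is an illustrative sketch only; sqlite3 stands in for Postgres, and the `Tick` fields and CSV‑style input rows are assumptions, not the company's actual schema.

```python
import sqlite3
from dataclasses import dataclass

@dataclass
class Tick:
    symbol: str
    price: float

def extract(raw_rows):
    # Parse raw provider rows (the "symbol, price" format is illustrative).
    for row in raw_rows:
        symbol, price = row.split(",")
        yield Tick(symbol=symbol.strip(), price=float(price))

def transform(ticks):
    # Drop non-positive prices and normalize symbols.
    for t in ticks:
        if t.price > 0:
            yield Tick(symbol=t.symbol.upper(), price=round(t.price, 4))

def load(conn, ticks):
    conn.execute("CREATE TABLE IF NOT EXISTS ticks (symbol TEXT, price REAL)")
    conn.executemany(
        "INSERT INTO ticks (symbol, price) VALUES (?, ?)",
        [(t.symbol, t.price) for t in ticks],
    )
    conn.commit()

raw = ["aapl, 189.5", "msft, 411.2", "bad, -1"]
conn = sqlite3.connect(":memory:")
load(conn, transform(extract(raw)))
rows = conn.execute("SELECT symbol, price FROM ticks ORDER BY symbol").fetchall()
```

In a production pipeline each stage would be a separately monitored step, but the extract → transform → load shape is the same.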

Responsibilities

  • Lead a team of 2‑3 Data Engineers to deliver production‑ready ETL pipelines and APIs exposing RAG and agent capabilities.
  • Implement, test, and improve scalable ETL workflows that handle large volumes of data.
  • Develop and maintain task queues with async tools, helping the team meet SLAs and surface operational metrics.
  • Build REST endpoints with the FastAPI framework, providing clean CRUD access to market data and configuration entities.
  • Add structured logs, traces, and metrics; contribute alerts and dashboards (Prometheus/Grafana, OpenTelemetry) that keep services at 99.9%+ availability.
  • Write type‑safe, well‑tested code, participate in code reviews, and help maintain CI/CD workflows.
  • Work closely with data scientists and product managers to translate data requirements into robust backend capabilities.
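The task‑queue responsibility above can be illustrated with a minimal asyncio sketch, using only the standard library as a stand‑in for Celery/Airflow‑style orchestration; the worker count, job payloads, and doubling "work" are placeholders.

```python
import asyncio

async def worker(queue: asyncio.Queue, results: list) -> None:
    # Drain jobs forever; queue.task_done() lets the producer await completion.
    while True:
        job = await queue.get()
        try:
            results.append(job * 2)  # placeholder for real pipeline work
        finally:
            queue.task_done()

async def main() -> list:
    queue: asyncio.Queue = asyncio.Queue()
    results: list = []
    workers = [asyncio.create_task(worker(queue, results)) for _ in range(3)]
    for job in range(10):
        queue.put_nowait(job)
    # All jobs processed: the natural point to check SLAs and emit metrics.
    await queue.join()
    for w in workers:
        w.cancel()
    return results

results = asyncio.run(main())
```

Real orchestrators add retries, scheduling, and persistence on top of this producer/worker shape.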

Qualifications

  • Education – Master's degree in Computer Science, Software Engineering, or a related field (or equivalent practical experience).
  • Experience – 5+ years building production Python backends for data‑intensive applications.
  • Python Mastery – Modern Python (3.10+) with asyncio, typing, dependency injection, and packaging best practices.
  • ETL Skills – Experience designing batch or streaming pipelines; comfort with Kafka, RabbitMQ, or similar tools.
  • Database Skills – Strong Postgres knowledge: migrations (Alembic, Django ORM), query tuning, partitioning, backup/restore.
  • Async Orchestration – Hands‑on with async tools like Airflow, Dagster, or Celery for scheduling and monitoring jobs.
  • API Engineering – Proven ability to implement secure, versioned REST/GraphQL APIs with auth, rate‑limiting, and RBAC.
  • Testing & DevOps – Familiarity with Pytest, Docker, Kubernetes, Terraform, and GitHub Actions (or similar).
  • Soft Skills – Clear written and verbal communication; collaborative mindset.

Nice‑to‑Haves

  • Time‑series or columnar DBs (TimescaleDB, ClickHouse, InfluxDB).
  • Knowledge of financial market‑data protocols (FIX, FAST) or regulatory feeds.