
Senior Data Engineer - Quantitative Hedge Fund

Abc Arbitrage

Paris

Hybrid

EUR 60 000 - 80 000

Full time

12 days ago

Job summary

A leading finance technology firm in Paris is seeking a Senior Data Engineer to join their Data Engineering team. This role will involve building scalable data platforms, handling large datasets, and delivering analysis-ready data for multiple teams. Candidates should have strong skills in Python, SQL, and Apache Spark, with at least 5 years of experience in a similar environment. This position offers an attractive compensation package and a hybrid remote work model.

Benefits

Attractive compensation package
Profit-sharing based on results
Great Place To Work certified


Skills

Python
SQL
Apache Spark
Airflow
AWS S3
Data modeling
Query optimization
Cloud infrastructure

Tools

Apache Airflow
Apache Spark
AWS Athena
Parquet

Job description
Overview

We are seeking a Senior Data Engineer to join our Data Engineering team. This role is central to shaping and evolving the data platform that fuels our quantitative research and trading. You will work directly with large-scale financial and time series datasets, designing solutions that make data accessible, reliable, and analysis-ready for quants, researchers, and traders. Your mission is not only to build pipelines, but also to unlock insights from data, ensuring our teams can innovate and move quickly with the right information at their fingertips.

Key Responsibilities
  • Data platform design & development: Build and evolve scalable, resilient pipelines for ingesting, processing, and structuring large volumes of financial and market data.
  • Time series & tick data handling: Manage high-frequency, high-volume datasets with efficient formats (Parquet) and partitioning schemes.
  • Data accessibility: Deliver clean, reliable, analysis-ready datasets for quants, researchers, and traders.
  • Data products: Develop lightweight, user-friendly dashboards and tools (e.g., Plotly Dash) to expose data insights.
  • Data quality & performance: Optimize data models, queries, and processing workflows to ensure reliability, speed, and accuracy.
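The tick-data responsibility above relies on a partitioning scheme. As a minimal illustration (the bucket name, column layout, and hive-style `date=`/`symbol=` convention are assumptions for the sketch, not details from the posting), here is how tick records might be routed to date/symbol partitions using only the standard library:

```python
from datetime import datetime

# Hypothetical hive-style partition layout for tick data:
#   s3://<bucket>/ticks/date=YYYY-MM-DD/symbol=<SYM>/part-*.parquet
# Partitioning by date and symbol lets engines such as Athena or Spark
# skip irrelevant files instead of scanning the whole dataset.

def partition_path(bucket: str, ts: datetime, symbol: str) -> str:
    """Build the partition prefix a tick record should be written under."""
    return f"s3://{bucket}/ticks/date={ts:%Y-%m-%d}/symbol={symbol}/"

# Route a small batch of made-up ticks to their partitions.
ticks = [
    (datetime(2024, 1, 2, 9, 30), "ABCA", 101.5),
    (datetime(2024, 1, 2, 9, 31), "ABCA", 101.6),
    (datetime(2024, 1, 3, 9, 30), "XYZ", 55.0),
]
batches: dict[str, list] = {}
for ts, symbol, price in ticks:
    batches.setdefault(partition_path("my-bucket", ts, symbol), []).append((ts, price))

for path, rows in sorted(batches.items()):
    print(path, len(rows))
```

In practice the per-partition batches would be written as Parquet files rather than printed; the point is that the partition key is computed once at write time so that reads can prune on it.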
Technology Stack
  • ETL/ELT workflows: Apache Airflow, Python
  • Distributed data processing: Apache Spark (performance optimization on large datasets)
  • Data storage & query optimization: AWS Athena, S3, related AWS cloud services
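The "query optimization" item in the stack above largely comes down to partition pruning on S3/Athena tables. As a hedged sketch (the partition values and date range are invented for the example), this shows the pruning step an engine performs when a query filters on the partition key:

```python
from datetime import date

# Sketch of partition pruning: when a query filters on the partition key
# (here, date), only the matching partitions need to be scanned.

partitions = [date(2024, 1, d) for d in range(1, 11)]  # date=2024-01-01 .. 2024-01-10

def prune(partitions: list[date], start: date, end: date) -> list[date]:
    """Return only the partitions a [start, end] date filter must scan."""
    return [p for p in partitions if start <= p <= end]

scanned = prune(partitions, date(2024, 1, 3), date(2024, 1, 5))
print(f"scanning {len(scanned)} of {len(partitions)} partitions")
# A query like
#   SELECT * FROM ticks WHERE date BETWEEN DATE '2024-01-03' AND DATE '2024-01-05'
# would therefore read 3 partitions instead of 10.
```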
About you

Required Qualifications

  • 5+ years of experience in a data engineering or backend software role, ideally in a financial or high-frequency trading environment.
  • Strong programming skills in Python (including use of Pandas, PySpark).
  • Expertise in SQL, Apache Spark, Airflow, AWS S3, Athena, and Parquet file formats.
  • Proven experience managing and transforming high-volume time series data.
  • Deep understanding of data modeling, query optimization, and big data performance tuning.
  • Experience in cloud infrastructure.
Preferred Qualifications
  • Familiarity with financial instruments, trading systems, and market microstructure.
  • Experience integrating market data from vendors (e.g., Bloomberg, Refinitiv, BMLL…).
  • Knowledge of DevOps practices and infrastructure-as-code (e.g., Terraform, Docker).
  • Proficiency in distributed computing and parallel processing techniques.

This offer describes the ideal profile. If you don't have all the skills listed but think you're up to the challenge, don't hesitate to apply!

Information
  • Permanent contract, starting as soon as possible or depending on your constraints
  • Attractive compensation package including profit-sharing based on operating results
  • Position based in Paris, Opéra-Bourse district
  • Hybrid remote work

ABC arbitrage is officially certified as a Great Place To Work 2023, and we are ranked among the Top 30 companies in our category!
99% of our employees say that ABC arbitrage is truly a great place to work!

Our commitment to diversity and inclusion is a priority. We strive to create an inclusive working environment, conducive to the fulfillment of each individual, while ensuring a harmonious balance between professional and personal life. Our policies in favor of gender equality and people with disabilities are at the heart of our approach.
