Data Engineer

PARAMETA SOLUTIONS (SINGAPORE) PTE. LIMITED

Singapore

On-site

SGD 80,000 - 110,000

Full time

Job summary

A leading data & analytics firm in Singapore is seeking a skilled Data Engineer to design, build, and maintain scalable data systems. The ideal candidate will have strong experience with Python and AWS, alongside expertise in data pipeline architectures. Join a dynamic team to deliver innovative data solutions and contribute to a high-performing engineering function. This position offers competitive benefits and opportunities for professional growth.

Skills

Data engineering
Python
SQL
AWS
Apache Kafka
Apache Airflow
Snowflake
Kubernetes
ETL processes
Linux

Tools

Prometheus
Grafana
CloudWatch
FastAPI
Dask
PySpark

Job description

The TP ICAP Group is a world-leading provider of market infrastructure.

Our purpose is to provide clients with access to global financial and commodities markets, improving price discovery, liquidity, and distribution of data, through responsible and innovative solutions.

Through our people and technology, we connect clients to superior liquidity and data solutions.

The Group is home to a stable of premium brands. Collectively, TP ICAP is the largest interdealer broker in the world by revenue, the number one Energy & Commodities broker, the world’s leading provider of OTC data, and an award-winning all-to-all trading platform.

Founded in London in 1866, the Group operates from more than 60 offices in 27 countries. We are 5,200 people strong. We work as one to achieve our vision of being the world’s most trusted, innovative liquidity and data solutions specialist.

About Parameta Solutions

Parameta Solutions is the Data & Analytics division of TP ICAP Group. The business provides clients with unbiased OTC content and proprietary data, and in-depth insights across price discovery, risk management, benchmarks and indices, and pre- and post-trade analytics. Its post-trade solutions offering helps market participants control their counterparty and regulatory risks through a growing range of tools that manage balance-sheet exposure, as well as compression and optimisation services. The Data & Analytics division includes the following brands: Tullett Prebon Information, PVM Data Services, ICAP Information and Burton-Taylor Consulting.

Role Overview

Parameta Solutions is seeking a skilled Data Engineer to join our growing global team. Based in Singapore, you will be a foundational member of our new hub, helping to shape a high‑performing engineering function. In this role, you will design, build, and maintain scalable systems and infrastructure to process and analyse complex, high‑volume datasets. Working closely with data scientists, analysts, product managers, and other stakeholders, you will translate business needs into robust technical solutions that power innovative data products and services.

Key Responsibilities

  • Design, build, and maintain performant batch and streaming data pipelines to support both real‑time market data and large‑scale batch processing (Apache Airflow, Apache Kafka, Apache Flink).
  • Develop scalable data warehousing solutions using Snowflake and other modern platforms.
  • Architect and manage cloud‑based infrastructure (AWS or GCP) to ensure resilience, scalability, and efficiency.
  • Enhance and optimise CI/CD pipelines (Jenkins, GitLab) to streamline development, testing, and deployment.
  • Monitor and maintain the health and performance of data applications, pipelines, and databases using observability tools (Prometheus, Grafana, CloudWatch).
  • Partner with stakeholders across functions to deliver reliable data services and enable analytical product development.
  • Participate actively in agile ceremonies (stand‑ups, sprint planning, retrospectives) to align on priorities and delivery.

Experience & Competencies

Essential

  • Demonstrated, professional experience in data engineering or a closely related role.
  • Proficiency in Python (with libraries such as Pandas, Dask, and PySpark) and strong SQL skills.
  • Experience building and scaling API‑driven data platforms (e.g., FastAPI).
  • Hands‑on experience with AWS, Snowflake, Kubernetes, and Airflow.
  • Knowledge of monitoring and alerting systems (Prometheus, Grafana, CloudWatch).
  • Strong understanding of ETL processes and event streaming (Kafka, Flink, etc.).
  • Comfortable with Linux and command‑line operations.
  • Excellent communication skills, with the ability to collaborate across technical and business teams.
  • Bachelor’s degree in Computer Science, Engineering, Mathematics, or related technical field.

Desired

  • Exposure to financial market data or capital markets environments.
  • Knowledge of additional programming languages such as Java, C#, or C++.