Data Engineer

EXODUSPOINT CAPITAL MANAGEMENT SINGAPORE, PTE. LTD.

Singapore

On-site

SGD 60,000 - 90,000

Full time

3 days ago

Job summary

A global investment management firm in Singapore is seeking an experienced Data Engineer to join its data engineering team. The role involves building enterprise data assets with financial datasets and collaborating across teams. Candidates should have a STEM degree and over 3 years of programming experience in Python and/or Java, along with knowledge of data modeling and AWS technologies.

Qualifications

  • 3+ years of experience in programming with Python and/or Java.
  • Experience working with financial datasets and enterprise financial vendor data products.
  • Familiar with time-series database technologies.

Responsibilities

  • Build data sets using Python, SQL, Snowflake, Kafka, AWS, and other technologies.
  • Engage with vendors to create valuable data assets.
  • Collaborate with core engineering to process and distribute data.

Skills

Python
SQL
Data modeling
Financial data knowledge
AWS
Apache Kafka

Education

Bachelor’s degree in STEM

Tools

Snowflake
API technologies
JIRA

Job description

ExodusPoint Capital, founded in 2017 by Michael Gelband, began managing investor capital in 2018. The firm employs a global multi-strategy investment approach, seeking to deliver compelling asymmetric returns by combining complementary liquid strategies managed by experienced investment professionals within a robust risk framework. ExodusPoint brings together an accomplished team with hands-on experience running multi-manager businesses to create an institutional investment management firm.

ExodusPoint is seeking an experienced Data Engineer with financial data knowledge to join an established data engineering team and help build a next-generation data offering for ExodusPoint’s diverse investment teams, which vary across asset classes and strategies. The ideal candidate is a domain expert on financial datasets who has the technical skills to work with the business and create enterprise data assets using advanced data processing techniques and tools.

The Enterprise Data group is focused on building a robust data platform which services a diverse set of investment teams and internal clients. The group consists of data sourcing experts, data product specialists, data scientists and data engineers, who are responsible for the discovery, management, and curation of thousands of alpha sources for the firm and our investment professionals.

Responsibilities
  • Build data sets using Python, SQL, Snowflake, Kafka, AWS, and other related technologies.
  • Understand financial reference data sets from Bloomberg and/or Refinitiv.
  • Engage with vendors and technical teams to systematically ingest, evaluate, and create valuable data assets.
  • Engage with technical and non-technical clients as an SME on data asset offerings.
  • Collaborate with the core engineering team to create central capabilities to process, manage, and distribute data assets at scale.
  • Apply robust data quality rules to systematically qualify data deliveries and guarantee the integrity of financial datasets.
  • Investigate and remediate domain-specific production issues escalated by the operations teams.
  • Enrich the central data catalog with advanced data profiling visualizations to enable discovery and evaluation.
  • Build up internal documentation and sample use cases for the data sets.
  • Partner with data strategy and sourcing team on data requirements to design data pipelines and delivery structures.
Qualifications
  • Bachelor’s degree in STEM
  • 3+ years of experience programming in Python and/or Java
  • Experience working with security master, financial datasets and/or enterprise financial vendor data products
  • Familiar with SQL and/or time-series database technologies
  • Experience with data modeling, data warehousing, and building data pipelines
  • Experience working with FTP, API, S3 and other distribution channels to source data
  • Experience working on multiple projects and with different stakeholders
Desired Qualifications
  • Hands-on experience with AWS native data and compute technologies (S3, Lambda, Glue, DataSync, EMR, Athena, Lake Formation, Kinesis, etc.).
  • Experience designing and working with APIs (REST, GraphQL, etc.).
  • Experience with Apache Kafka or other data streaming technologies.
  • Knowledge of developing containerized applications.
  • Experience with JIRA and Agile project management.