Senior Data Platform Engineer

Alpaca

Indonesia

Remote

IDR 1.324.284.000 - 1.986.427.000

Full time

Job summary

A leading financial technology company is seeking a Senior Data Engineer to design and develop the data management layer for their trading platform. This role requires over 7 years of experience in data engineering, particularly with scalable data platforms and strong proficiency in Python and SQL. The company offers competitive salaries and flexible remote working conditions.

Benefits

Competitive Salary & Stock Options
New Hire Home-Office Setup: One-time USD 500
Monthly Stipend: USD 150

Qualifications

  • 7+ years of experience in data engineering.
  • Experience building scalable, low-latency data platforms handling >100M events/day.
  • Strong hands-on experience with relational database systems.

Responsibilities

  • Design and oversee key ETL patterns for stakeholders.
  • Develop scalable transformation patterns for BI tools.
  • Collaborate closely with teams to address data flow needs.

Skills

Data engineering
Python
SQL
Docker
Kubernetes
ETL technologies
Streaming systems

Tools

Kafka
Airflow
Airbyte

Job description

Overview

Alpaca is a US-headquartered self-clearing broker-dealer and brokerage infrastructure for stocks, ETFs, options, crypto, fixed income, 24/5 trading, and more. Our recent Series C funding round brought our total investment to over $170 million, fueling our ambitious vision.

Through its subsidiaries, Alpaca is a licensed financial services company, serving hundreds of financial institutions across 40 countries with institutional-grade APIs. These include broker-dealers, investment advisors, wealth managers, hedge funds, and crypto exchanges, totaling over 6 million brokerage accounts.

Our global team is a diverse group of experienced engineers, traders, and brokerage professionals who are working to achieve our mission of opening financial services to everyone on the planet. We are committed to open-source contributions and fostering a vibrant community, continuously enhancing our award-winning, developer-friendly API and the robust infrastructure behind it.

Alpaca is backed by top-tier global investors, including Portage Ventures, Spark Capital, Tribe Capital, Social Leverage, Horizons Ventures, Unbound, SBI Group, Derayah Financial, Elefund, and Y Combinator.

Our Team Members:

We are a dynamic, globally distributed team of 230+ members working from the USA, Canada, Japan, Hungary, Nigeria, Brazil, the UK, and beyond. We are looking for passionate individuals who align with our core values—Stay Curious, Have Empathy, and Be Accountable—and are ready to make a significant impact.

Your Role

We are seeking a Senior Data Engineer to design and develop the data management layer for our platform to ensure its scalability as we expand to larger customers and new jurisdictions. At Alpaca, data engineering encompasses financial transactions, customer data, API logs, system metrics, augmented data, and third-party systems that impact decision-making for both internal and external users. We process hundreds of millions of events daily, with this number growing as we onboard new customers.

We prioritize open-source solutions in our data management approach, leveraging a Google Cloud Platform (GCP) foundation for our data infrastructure. This includes batch/stream ingestion, transformation, and consumption layers for BI, internal use, and external third-party sinks. Additionally, we oversee data experimentation, cataloging, and monitoring and alerting systems.
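To make the ingestion layer concrete, here is a minimal stream-to-lake sketch in the spirit of the stack named above (Kafka in, GCS out). The topic, bucket, and batch size are hypothetical placeholders, not Alpaca's actual configuration:

```python
import json

from google.cloud import storage  # pip install google-cloud-storage
from kafka import KafkaConsumer   # pip install kafka-python

# Hypothetical names: topic, bucket, and batch size are illustrative only.
TOPIC = "trade-events"
BUCKET = "example-raw-data-lake"
BATCH_SIZE = 10_000

consumer = KafkaConsumer(
    TOPIC,
    bootstrap_servers="localhost:9092",
    value_deserializer=json.loads,  # raw bytes -> dict
    enable_auto_commit=False,       # commit offsets only after a batch is persisted
)
bucket = storage.Client().bucket(BUCKET)

batch, part = [], 0
for message in consumer:
    batch.append(message.value)
    if len(batch) >= BATCH_SIZE:
        # Land newline-delimited JSON, one object per event, ready for batch loads.
        blob = bucket.blob(f"raw/{TOPIC}/part-{part:06d}.jsonl")
        blob.upload_from_string("\n".join(json.dumps(e) for e in batch))
        consumer.commit()  # at-least-once delivery into the lake
        batch, part = [], part + 1
```

Committing offsets only after the upload succeeds trades occasional duplicates for zero data loss, which downstream deduplication in the transformation layer can absorb.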

Our team is 100% distributed and remote.

What You Will Do
  • Design and oversee key forward and reverse ETL patterns to deliver data to relevant stakeholders (a minimal orchestration sketch follows this list).
  • Develop scalable patterns in the transformation layer to ensure repeatable integrations with BI tools across various business verticals.
  • Expand and maintain the constantly evolving elements of the Alpaca Data Lakehouse architecture.
  • Collaborate closely with sales, marketing, product, and operations teams to address key data flow needs.
  • Operate the system and manage production issues in a timely manner.
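
As a sketch of the forward-ETL orchestration mentioned in the first bullet above, here is a minimal Airflow DAG wiring extract, transform, and load steps. The DAG id, schedule, and task bodies are placeholders, not an actual Alpaca pipeline:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

# Placeholder task bodies; real tasks would call into the ingestion,
# transformation, and consumption layers described above.

def extract():
    ...  # e.g., pull the day's events from the ingestion layer

def transform():
    ...  # e.g., apply shared transformation patterns for BI consumption

def load():
    ...  # e.g., deliver curated outputs to stakeholder-facing sinks

with DAG(
    dag_id="example_forward_etl",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",             # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> transform_task >> load_task
```
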
Who You Are (Must-Haves)
  • 7+ years of experience in data engineering, including 2+ years of building scalable, low-latency data platforms capable of handling >100M events/day.
  • Proficiency in at least one programming language, with strong working knowledge of Python and SQL.
  • Experience with cloud-native technologies like Docker, Kubernetes, and Helm.
  • Strong hands-on experience with relational database systems.
  • Experience in building scalable transformation layers, preferably through formalized SQL models (e.g., dbt); a sketch of this pattern follows the list.
  • Ability to work in a fast-paced environment and adapt solutions to changing business needs.
  • Experience with ETL technologies like Airflow and Airbyte.
  • Production experience with streaming systems like Kafka.
  • Exposure to infrastructure, DevOps, and Infrastructure as Code (IaC).
  • Deep knowledge of distributed systems, storage, transactions, and query processing.
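
To illustrate the formalized-SQL-model requirement above, here is a self-contained sketch of the pattern dbt formalizes: a named SELECT materialized as a curated table. sqlite3 stands in for the warehouse so the example runs anywhere, and all table and column names are hypothetical:

```python
import sqlite3

# A "formalized SQL model": a named, versioned SELECT that materializes a
# curated table from raw events. dbt manages exactly this pattern at scale.
DAILY_FILLS_MODEL = """
CREATE TABLE IF NOT EXISTS analytics_daily_fills AS
SELECT
    date(filled_at)        AS fill_date,
    symbol,
    COUNT(*)               AS fill_count,
    SUM(qty * fill_price)  AS notional
FROM raw_fills
GROUP BY 1, 2;
"""

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_fills (filled_at TEXT, symbol TEXT, qty REAL, fill_price REAL)")
conn.execute("INSERT INTO raw_fills VALUES ('2024-01-02T14:30:00', 'AAPL', 10, 185.0)")
conn.executescript(DAILY_FILLS_MODEL)  # materialize the model
print(conn.execute("SELECT * FROM analytics_daily_fills").fetchall())
# -> [('2024-01-02', 'AAPL', 1, 1850.0)]
```

In dbt itself, the SELECT would live in its own model file, with materialization, tests, and lineage handled by the framework.
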
How We Take Care of You
  • Competitive Salary & Stock Options
  • New Hire Home-Office Setup: One-time USD 500
  • Monthly Stipend: USD 150 via a Brex Card

We are proud to be an equal opportunity workplace dedicated to pursuing and hiring a diverse workforce.

Apply for this job

To apply, please submit your résumé and the required information through the application process.
