Senior/Staff Engineer, Data Warehouse

OKX

Singapore

On-site

SGD 70,000 - 90,000

Full time

Posted yesterday

Job summary

A leading digital asset exchange in Singapore is looking for a Data Engineer to design and build data pipelines. The ideal candidate will have strong expertise in data processing tools like Spark and experience with both batch and streaming data pipelines. Responsibilities include architecting cloud-based data infrastructure and collaborating with diverse teams to improve data-driven platforms and solutions. Candidates should be fluent in English and have a Bachelor's in Computer Science or equivalent experience.

Qualifications

  • Bachelor’s Degree in Computer Science or related field.
  • Solid experience with data processing tools such as Spark and Flink.
  • Experience implementing both batch and streaming data pipelines.

Responsibilities

  • Design and build resilient data pipelines for batch and real-time data.
  • Architect data infrastructure using cloud tools.
  • Collaborate with product managers and engineers to build data-driven platforms.

Skills

Data processing tools (e.g., Spark, Flink)
Python/Go/Scala/Java
SQL and NoSQL databases
DevOps tools (Git, Docker, k8s)
Cloud services (AWS, Ali Cloud, GCP, Azure)
Data analysis

Education

Bachelor’s Degree in Computer Science or equivalent experience

Tools

Git
Docker
k8s
Amplitude/Tableau/QlikView
Hadoop

Job description

OKX will be prioritising applicants who have a current right to work in Singapore and do not require OKX's sponsorship of a visa.

Who We Are

At OKX, we believe the future will be reshaped by technology. Founded in 2017, we are revolutionising world systems through our cutting‑edge digital asset exchange, Web3 portal and blockchain ecosystems. We reshape the financial ecosystem by offering some of the most diverse and sophisticated products, solutions, and trading tools on the market. Trusted by more than 50 million users in over 180 countries globally, OKX empowers every individual to explore the world of Web3. With our extensive range of products and services, and unwavering commitment to innovation, OKX envisions a world of financial access backed by blockchain and the power of decentralized finance.

We are innovative in the way we think, work, and in the products we create. We are also socially responsible by actively participating and encouraging employees to take part in various public welfare activities. With more than 3,000 employees around the world, we believe embracing diversity and inclusion will spark the creation of long‑term value for the industry. Come Build the Future with Us now!

About the team:

The OKX data team is responsible for the whole data scope of OKG, from technology selection, architecture design, data ingestion, data storage, ETL, and data visualization to business intelligence and data science. We are data engineers, data analysts, and data scientists. The team has end-to-end ownership of most of the data at OKX throughout the whole data lifecycle, including data ingestion, ETL, data warehousing, and data services. As a data engineer, you will work with the team to leverage data technologies to empower evidence-based decision-making and improve the quality of the company's products and services.

Responsibilities:

  • Design and build resilient and efficient data pipelines for both batch and real‑time streaming data
  • Architect and design data infrastructure on the cloud using industry-standard tools
  • Execute projects with an Agile mindset
  • Build software frameworks to solve data problems at scale
  • Collaborate with product managers, software engineers, data analysts and data scientists to build scalable and data‑driven platforms and tools
  • Ensure data integrity and scalability by enforcing data standards. Improve data validation and monitoring processes to proactively prevent issues, identify them quickly, and drive them to resolution.
  • Define, understand, and test external/internal opportunities to improve our products and services.

Requirements:

  • Bachelor’s Degree in Computer Science or equivalent professional experience
  • Solid experience with data processing tools such as Spark and Flink
  • Solid experience implementing batch and streaming data pipelines
  • Solid experience in Python, Go, Scala, or Java
  • In‑depth knowledge of both SQL and NoSQL databases, including performance tuning and troubleshooting
  • Familiar with DevOps tools such as Git, Docker, k8s
  • Experience with the cloud (e.g. AWS, Ali Cloud, GCP, Azure)
  • Proficient in SQL, including advanced features such as window functions, aggregate functions, and creating scalar/user-defined functions
  • Proven track record of delivering full end-to-end data solutions spanning data ingestion, data persistence, data extraction, and data analysis
  • Self-driven, innovative, and collaborative, with good communication and presentation skills
  • Fluent in English, both written and spoken

Preferred Qualifications:

  • Experience in FinTech, eCommerce, SaaS, AdTech, or digital wallet industries
  • Experience working with teams across offices and time zones is a plus
  • Experience with big data and analytics tools such as Amplitude, Tableau, QlikView, Ali Cloud DataWorks, MaxCompute, Hadoop, Hive, Spark, and HBase is a big plus

There's more that we'd love to tell you throughout the process!
