Big Data Engineer (Libra) - Data Platform

ByteDance

Singapore

On-site

SGD 60,000 - 90,000

Full time

9 days ago

Job summary

ByteDance is seeking a Big Data Engineer for its Data Platform team in Singapore. The role focuses on managing the data systems of its experimentation platform and developing ETL pipelines, and requires proficiency with big data frameworks and a relevant degree. Join a culture of creativity and continuous improvement, working with a diverse team dedicated to impactful innovation.

Qualifications

  • At least 1 year of experience in Data Engineering.
  • Experience with data warehouse implementation is required.
  • Familiarity with large-scale distributed systems is beneficial.

Responsibilities

  • Managing the data system of the experimentation platform.
  • Designing, modeling, and developing PB-level data warehouses.
  • Building ETL data pipelines and automated systems.

Skills

  • Proficiency with big data frameworks
  • Coding in Java, Scala, SQL, and Python

Education

Bachelor's degree in Computer Science

Tools

  • Hadoop
  • Spark
  • Presto
  • Hive
  • Flink
  • ClickHouse

Job description

Big Data Engineer (Libra) - Data Platform

Singapore | Regular | R&D | Job ID: A220563

Responsibilities

About the team:

Libra is a large-scale, one-stop online A/B testing platform developed by the Data Platform team. Its features include:

  • Providing experiment evaluation services for all product lines within the company, covering complex scenarios such as recommendations, algorithms, features, UI, marketing, advertising, operations, social isolation, and causal inference.
  • Supporting the entire experiment lifecycle, from design and creation through metric calculation and statistical analysis to final evaluation and launch.
  • Enabling the business to iterate rapidly through trial and error, emphasizing bold hypotheses and careful verification.
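Libra's internals are not described in this posting, but the foundation of any A/B testing platform of this kind is deterministic user bucketing. A minimal sketch in Python (all names and the two-variant setup are illustrative, not Libra's actual design):

```python
import hashlib

def assign_variant(user_id: str, experiment_id: str,
                   variants=("control", "treatment")) -> str:
    """Deterministically bucket a user into an experiment variant.

    Hashing (experiment_id, user_id) together keeps each user's
    assignment stable across sessions while keeping assignments
    independent across different experiments.
    """
    digest = hashlib.md5(f"{experiment_id}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % len(variants)
    return variants[bucket]

# The same user always lands in the same variant for a given experiment.
print(assign_variant("user_42", "exp_ui_redesign"))
```

Because assignment is a pure function of the IDs, no assignment table needs to be stored; the warehouse can recompute buckets when joining exposure and metric data.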

Responsibilities:

  • Managing the data system of the experimentation platform, including operation and maintenance.
  • Designing, modeling, and developing PB-level data warehouses.
  • Building ETL data pipelines and automated systems.
  • Developing an expert system for metric data processing that integrates offline and real-time processing.
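The responsibilities above center on ETL pipelines feeding experiment metrics. As a toy illustration only (hypothetical event format, and an in-memory dict standing in for a real warehouse), the extract/transform/load stages might look like:

```python
from collections import defaultdict

def extract(raw_events):
    # Parse raw log lines of the form "user_id,variant,clicked" into dicts.
    for line in raw_events:
        user_id, variant, clicked = line.strip().split(",")
        yield {"user": user_id, "variant": variant, "clicked": int(clicked)}

def transform(events):
    # Aggregate click-through rate per experiment variant.
    totals = defaultdict(lambda: [0, 0])  # variant -> [clicks, impressions]
    for e in events:
        totals[e["variant"]][0] += e["clicked"]
        totals[e["variant"]][1] += 1
    return {v: clicks / n for v, (clicks, n) in totals.items()}

def load(metrics, sink):
    # Persist computed metrics (here, into an in-memory "warehouse").
    sink.update(metrics)

warehouse = {}
raw = ["u1,control,0", "u2,control,1", "u3,treatment,1", "u4,treatment,1"]
load(transform(extract(raw)), warehouse)
print(warehouse)  # per-variant click-through rate
```

In production such stages would run on the frameworks listed below (e.g. Spark or Flink for transform, Hive or ClickHouse as the sink), with a scheduler orchestrating the jobs.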

Qualifications

Minimum Qualifications:

  • Bachelor's degree in Computer Science or related technical field, or equivalent practical experience.
  • Proficiency with big data frameworks such as Presto, Hive, Spark, Flink, ClickHouse, and Hadoop, with experience in large-scale data processing.
  • At least 1 year of experience in Data Engineering.
  • Proficiency in coding with Java, Scala, SQL, Python, or similar languages.
  • Experience with data warehouse implementation and supporting actual business scenarios.

Preferred Qualifications:

  • Knowledge of data ingestion, modeling, processing, and persistence strategies, ETL design, job scheduling, and dimensional modeling.
  • Expertise in designing, analyzing, and troubleshooting large-scale distributed systems (Hadoop, Spark, Presto, Kafka, etc.).
  • Work or internship experience at internet companies, especially in big data processing.

Job Information

About Us:

Founded in 2012, ByteDance's mission is to inspire creativity and enrich life. With products like TikTok, Lemon8, CapCut, and platforms like Toutiao, Douyin, and Xigua, ByteDance aims to connect, entertain, and empower users worldwide.

Why Join ByteDance:

We foster creativity through innovative products that enable authentic self-expression, discovery, and connection. Our diverse teams drive meaningful impact, and we cultivate a culture of curiosity, humility, and continuous iteration. Join us to be part of a limitless journey.

Diversity & Inclusion:

ByteDance is dedicated to creating an inclusive environment that values skills, experiences, and perspectives. We celebrate diversity and aim to reflect the communities we serve through our workplace and products.
