Senior Data Engineer

5G-Starlink Pte.

Malacca City

On-site

MYR 100,000 - 150,000

Full time

Job summary

A leading technology company in Malaysia is seeking a detail-oriented Senior Data Engineer to define and manage data pipelines. The role requires at least 5 years of experience and expertise in technologies such as Google Cloud Platform and PostgreSQL. Key responsibilities include ensuring data integrity, designing secure data architectures, and developing effective ETL pipelines. The position offers a dynamic work environment with opportunities for growth and a competitive salary.

Benefits

Competitive salary
Annual performance bonus
Retirement plan
Gym and recreational facilities
Overseas travel opportunities

Qualifications

  • Minimum of 5 years of experience in the data engineering field.
  • Expertise in data modelling techniques like Kimball star schema.
  • Proficiency in Python and PostgreSQL.

Responsibilities

  • Ensure data integrity while managing complex data sources.
  • Design and build secure, high-performance data warehouses.
  • Develop, test, and maintain data processing architectures.

Skills

Data modelling techniques
Python
Relational SQL
NoSQL databases
ETL/ELT data pipelines
Google Cloud Platform
Analytical skills

Tools

Airflow
PostgreSQL
Docker

Job description

We are the Business Intelligence team. We lead the organisation in cultivating a data-driven culture as our company moves toward the future. We collect meaningful data and analytics so that we can deeply understand our consumers and build more valuable products and services. What we do is incredibly important in driving smart marketing decisions, optimising our business, and increasing profitability.

Your role

At Deriv, we are looking for a detail-oriented Senior Data Engineer who can define the data pipelines behind our future data models. You will leverage technologies such as Google Cloud Platform, Airflow, Python, Docker, and PostgreSQL to provide the company with dependable business intelligence solutions. As part of this role, you will develop, test, and maintain architectures for data processing and build Extract, Transform, and Load (ETL) pipelines (a rough sketch follows the responsibilities below). You will be a key contributor to making our data warehouse trustworthy by ensuring data accuracy.

  • Ensure data integrity while extracting data from complex in-house and third-party sources, and manage its systematic storage; take responsibility for data security, accuracy, and accessibility
  • Provide tangible business solutions and decisions using your expertise in the data engineering domain
  • Design and build high-performance, secure, and scalable company data warehouse and pipeline to support data science projects following best practices
  • Debug and resolve complex issues, and recommend improvements to ensure a well-functioning ETL pipeline architecture
  • Transform raw data into easy-to-use tables for the Data Analysts
  • Keep up-to-date on company products and new releases to efficiently plan changes in our data warehouse or pipelines
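
The stack above can be tied together in many ways. As a rough, assumed sketch only (the DAG name, schedule, and task logic below are illustrative, not part of this posting), a minimal daily ETL job in Airflow might look like this:

    # Minimal sketch of a daily ETL DAG; all names here are hypothetical.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def extract_and_load():
        # Placeholder: pull rows from a source system, transform them, and
        # load them into the warehouse (e.g. PostgreSQL or BigQuery).
        pass


    with DAG(
        dag_id="daily_events_etl",        # hypothetical DAG name
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        PythonOperator(
            task_id="extract_and_load",
            python_callable=extract_and_load,
        )
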
What you have
  • A minimum of 5 years of experience in the data engineering field
  • Expertise in data modelling techniques such as Kimball star schema, Anchor modelling, and Data vault
  • Competence in object-oriented or functional scripting languages such as Python
  • Proficiency in relational SQL and NoSQL databases, preferably PostgreSQL, including PITR, pg_basebackup, WAL archiving, and replication
  • Familiarity with column-oriented storage formats and data warehouses such as Parquet, Redshift, and BigQuery
  • In-depth skills in developing and maintaining ETL/ELT data pipelines using workflow management tools such as Airflow
  • Hands-on experience with Google Cloud Platform (GCP) services such as BigQuery, scheduled queries, Cloud Storage, and Cloud Functions (a minimal loading sketch follows this list)
  • Familiarity with alerting and self-recovery methods for maintaining data accuracy
  • Analytical skills with the ability to transform data into optimal business decisions
  • Expertise in peer reviewing pipeline code and suggesting improvements when required
  • Experience in helping teams make informed business decisions with data
  • Strong communication and presentation skills
  • Fluency in spoken and written English
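
As an illustration of how a couple of the requirements above fit together (column-oriented storage such as Parquet alongside GCP services such as Cloud Storage and BigQuery), here is a minimal, assumed loading sketch using the google-cloud-bigquery client; the project, bucket, dataset, and table names are hypothetical:

    # Minimal sketch: load Parquet files from Cloud Storage into BigQuery.
    # Project, bucket, dataset, and table names are hypothetical.
    from google.cloud import bigquery

    client = bigquery.Client(project="example-project")

    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.PARQUET,
        write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
    )

    load_job = client.load_table_from_uri(
        "gs://example-bucket/events/*.parquet",    # source Parquet files
        "example-project.analytics.fact_events",   # destination table
        job_config=job_config,
    )
    load_job.result()  # block until the load job finishes
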
What's good to have
  • A good background in cybersecurity and data protection
  • Proficiency in using data pipeline and workflow management tools such as Luigi
  • Exposure to maintaining and monitoring database health and resolving errors
  • Experience in managing stakeholders' expectations and gathering technical requirements
  • Familiarity with container technologies such as Docker

What we'll give you
  • The best workplace you can possibly imagine: a gorgeous 5-storey building with a rooftop garden, a gym, a squash court, a yoga room, a barbecue pit, a jam studio, and a lot more!
  • A chance to work with top talent from across the globe (55+ nationalities)
  • Ample team-building and bonding activities
  • Great overseas travel opportunities
  • Competitive salary, annual performance bonus, and retirement plan