Data Engineer, Big Data (18-month contract)

OPTIMUM SOLUTIONS (SINGAPORE) PTE LTD

Singapore

On-site

SGD 80,000 - 110,000

Full time

Job summary

A leading data solutions provider in Singapore is seeking a Data Engineer for an 18-month contract. The role involves designing and maintaining scalable data pipelines, collaborating with stakeholders, and ensuring data quality. Ideal candidates have more than 5 years of experience in data engineering, a solid grasp of big data technologies, and proficiency in SQL. Knowledge of cloud platforms and BI tools is essential. This position is key to supporting business decision-making through effective data management.

Qualifications

  • 5+ years of experience in data engineering or related roles.
  • Solid understanding of data warehousing, ETL design patterns, and data modeling.
  • Hands-on experience with SQL.

Responsibilities

  • Design and maintain ETL pipelines and data models for large-scale datasets.
  • Analyze complex data and deliver insights through BI dashboards and reports.
  • Collaborate with stakeholders to translate business requirements into technical solutions.
  • Build and maintain CI/CD pipelines for data platforms.
  • Ensure data quality, security, and compliance.
  • Troubleshoot data issues and optimize pipeline performance.

Skills

  • Big data technologies (e.g. Hadoop, Spark, Hive)
  • SQL
  • Python and/or Java
  • Data warehousing
  • ETL design patterns
  • BI tools (Tableau, Power BI, Looker)

Tools

  • GCP
  • BigQuery
  • AWS
  • Azure

Job description

This is an 18-month contract.

As a Data Engineer, you will design, build, and maintain scalable data pipelines and analytics solutions to support business decision-making. You will work closely with business and technical stakeholders to translate data requirements into robust data models, ETL pipelines, and BI dashboards, while ensuring data quality, security, and governance.

Requirements

  • Experience with big data technologies (e.g. Hadoop, Spark, Hive)
  • Hands-on experience with SQL
  • 5+ years of experience in data engineering or related roles
  • Experience with Python and/or Java
  • Solid understanding of data warehousing, ETL design patterns, and data modeling
  • Experience with cloud data platforms (e.g. GCP, BigQuery, AWS, Azure)
  • Familiarity with BI tools (Tableau, Power BI, Looker)