Data Engineer (Integrations)

YINSON RENEWABLES AS

Singapore

On-site

SGD 70,000 - 90,000

Full time

12 days ago

Job summary

A leading energy solutions company based in Singapore is seeking a Data Engineer (Integrations) to design and maintain data integration pipelines. The ideal candidate should possess a Bachelor’s degree in Computer Science or related field and have at least 3 years of experience in data engineering. Responsibilities include developing ETL architectures, ensuring data quality, and coordinating integration requirements. Join us in delivering powerful solutions and making a difference!

Qualifications

  • At least 3 years’ experience in data engineering or system integrations.
  • Experience designing and maintaining data processing systems.
  • Strong proficiency in SQL-based platforms.

Responsibilities

  • Design and maintain data integration pipelines connecting various systems.
  • Develop scalable ETL/ELT architectures for analytics.
  • Ensure data quality and reliability across multi-cloud environments.

Skills

SQL proficiency
Python or Java for data processing
Analytical skills
Problem-solving skills
Communication skills
Ability to learn new technologies

Education

Bachelor’s degree in Computer Science, Computer Engineering, or Information Technology

Job description

Legal Entity: Yinson Production Offshore Pte Ltd

Job Function: Database & Business and Data Analytics

Employment Type: Permanent

Yinson is a dynamic, equal opportunity employer with a great organisational culture where people are valued and empowered to deliver powerful solutions.

Job Summary

The Data Engineer (Integrations) is responsible for:

  • Designing, building, and maintaining robust data integration pipelines that connect operational, engineering, and enterprise systems.
  • Developing scalable ETL / ELT architectures that transform raw data into reliable, machine-readable formats for analytics, BI, and digital solutions.
  • Ensuring data quality, reliability, security, and interoperability across multi-cloud and hybrid environments.
  • Supporting asset lifecycle data requirements across Business Development, Project, and Operations phases.

Key Responsibilities

  • Design, develop, and maintain automated data pipelines for ingesting, transforming, and contextualising data from multiple internal and external sources.
  • Build, operate, and maintain system-to-system data integrations, including SQL ↔ JSON data transformations via APIs, across on-prem, cloud, and SaaS systems of record (a simplified sketch follows this list).
  • Design, develop, test and implement data architectures to support analytics, modelling, digital workflows, and AI-enabled use cases including knowledge-retrieval pipelines for RAG.
  • Ensure data quality, validation, and reliability metrics are defined, monitored, and enforced. Support real-time and batch data acquisition systems, ensuring availability, performance, and production scalability.
  • Prepare and structure data for descriptive, predictive, and prescriptive analytics. Support the development and maintenance of medallion architecture data design patterns.
  • Coordinate integration requirements during Business Development and Project phases, ensuring downstream analytics readiness.
  • Enable secure access to data for dynamic and steady-state simulations, including for Operator Training Simulators (OTS), through to operational handover.
  • Work closely with Data Analysts and Digital Solutions Analysts to ensure downstream dashboards, AI solutions, and workflows are built on reliable data pipeline foundations.
  • Plan and document technical specifications, integration designs, and data flows. Identify integration risks, performance bottlenecks, and data gaps, and propose mitigation strategies.
  • Contribute recommendations on process improvements, architecture enhancements, and new technologies.
  • Monitor emerging risks and support prevention, mitigation, and management initiatives.
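
For illustration only, and not part of the role requirements: the SQL ↔ JSON integration work described above could, in its simplest form, resemble the Python sketch below. The source table sensor_readings, its field names, and the file-based target are hypothetical stand-ins; a production pipeline would typically push the records to an API or cloud system of record instead.

# Illustrative only: a hypothetical SQL -> JSON hand-off, not Yinson's actual stack.
import json
import sqlite3
from datetime import datetime, timezone


def extract_readings(db_path):
    """Pull raw rows from a hypothetical source table into dictionaries."""
    with sqlite3.connect(db_path) as conn:
        conn.row_factory = sqlite3.Row
        rows = conn.execute(
            "SELECT asset_id, tag, value, recorded_at FROM sensor_readings"
        ).fetchall()
    return [dict(r) for r in rows]


def transform(rows):
    """Apply a basic data-quality gate and contextualise each record."""
    records = []
    for row in rows:
        if row["value"] is None:  # drop incomplete readings
            continue
        records.append({
            "assetId": row["asset_id"],
            "tag": row["tag"],
            "value": float(row["value"]),
            "recordedAt": row["recorded_at"],
            "ingestedAt": datetime.now(timezone.utc).isoformat(),
        })
    return records


def load(records, target_path):
    """Write transformed records as JSON; a real pipeline would call an API here."""
    with open(target_path, "w", encoding="utf-8") as fh:
        json.dump(records, fh, indent=2)


if __name__ == "__main__":
    load(transform(extract_readings("source.db")), "readings.json")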

Requirements

  • Bachelor’s degree in Computer Science, Computer Engineering, Information Technology, or related discipline.
  • At least 3 years’ experience in data engineering, system integrations, or data platform development.
  • Strong proficiency in SQL-based data platforms.
  • Proficiency in Python or Java for data processing and automation.
  • Experience designing, building, and maintaining data processing and integration systems.
  • Strong analytical, problem-solving, and communication skills.
  • Ability to learn new technologies quickly and apply them effectively.

If you want to team up with us as we dream and stride towards a better tomorrow, we would love to hear from you!
