Data Engineer at Buona Vista / 2 days WFH

ETHOS SEARCH ASSOCIATES PTE. LTD.

Singapore

On-site

SGD 90,000 - 120,000

Full time

Posted today

Job summary

A leading organization in data and analytics seeks a Data Engineer to design, build, and maintain enterprise data architecture in Singapore. The ideal candidate has over 8 years of experience and is proficient in Snowflake and Azure Data Factory. Responsibilities include designing data pipelines, optimizing performance, and collaborating across teams to implement data-driven solutions. This position offers a dynamic work environment aimed at enhancing data reliability and scalability.

Qualifications

  • At least 8–10 years of experience in data engineering or platform development roles.
  • Hands-on experience with Snowflake, Azure Data Factory, and metadata catalogues.
  • Strong knowledge of data modelling, cloud-warehouse architecture, and CI/CD automation.

Responsibilities

  • Design and implement robust data pipelines using Azure Data Factory.
  • Build and optimise data marts to support enterprise dashboards.
  • Administer Snowflake accounts and integrate metadata for governance.

Skills

SQL
Python
Data Orchestration Tools
Collaboration

Education

Bachelor’s degree in Computer Science, Data Engineering or related discipline

Tools

Snowflake
Azure Data Factory
Airflow
dbt
Alation
Collibra

Job description

About the Job

The Department helps the Organisation make better decisions, work efficiently, and innovate confidently by improving data use, developing business intelligence, and enabling responsible AI adoption.

The Data Engineer designs, builds, and maintains the organisation’s enterprise data architecture.

This role ensures that data across HQ and Research Institutes is reliable, secure, and analytics-ready. The Data Engineer also drives standards, performance optimisation, and automation for scalable analytics and AI delivery.

Key Responsibilities
A. Data Architecture & Engineering
  • Design and implement robust data pipelines using Azure Data Factory (ADF) and ELT/ETL best practices (see the illustrative sketch after this list).
  • Build and optimise data marts to support enterprise dashboards and analytics tools (e.g., Power BI).
  • Optimise Snowflake environments for performance, cost, and scalability.
  • Partner with IT Shared Services to ensure compliance and enterprise security standards.
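
The pipeline and data-mart duties above amount to standard ELT work. Purely as a rough illustration (not part of the posting, and not the organisation's actual setup), the sketch below loads staged files into Snowflake and rebuilds a small reporting mart from Python; every account, stage, table, and column name is a hypothetical placeholder.

```python
# Illustrative sketch only: a minimal ELT step of the kind described above.
# All connection details, stages, tables, and columns are hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="example_account",   # hypothetical account identifier
    user="etl_service_user",     # hypothetical service user
    password="...",              # in practice, injected from a secret store
    warehouse="TRANSFORM_WH",
    database="ANALYTICS",
    schema="STAGING",
)
cur = conn.cursor()
try:
    # Extract/Load: copy raw files from an external stage into a staging table.
    cur.execute("""
        COPY INTO staging.orders_raw
        FROM @landing_stage/orders/
        FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
    """)
    # Transform: rebuild a mart table consumed by enterprise dashboards.
    cur.execute("""
        CREATE OR REPLACE TABLE marts.daily_order_summary AS
        SELECT order_date,
               COUNT(*)    AS order_count,
               SUM(amount) AS total_amount
        FROM staging.orders_raw
        GROUP BY order_date
    """)
finally:
    cur.close()
    conn.close()
```

In the role itself, a step like this would normally run as an orchestrated ADF or Airflow task rather than a standalone script, with the SQL kept under version control.
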
B. Platform Administration & Automation
  • Administer Snowflake accounts, roles, warehouses, and data-sharing configurations (see the illustrative sketch after this list).
  • Integrate metadata and lineage flows into Alation to enhance discoverability and governance.
  • Implement automation pipelines and CI/CD practices for ingestion, validation, and deployment.
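
Purely as an illustration of the account-administration and automation duties listed above (not the organisation's actual configuration), the sketch below applies idempotent Snowflake role, warehouse, and grant statements through an open cursor; the role, warehouse, database, and schema names are invented for the example.

```python
# Illustrative sketch only: idempotent Snowflake administration statements of the
# kind described above. Role, warehouse, database, and schema names are hypothetical.
ADMIN_STATEMENTS = [
    "CREATE ROLE IF NOT EXISTS ANALYST_ROLE",
    "CREATE WAREHOUSE IF NOT EXISTS REPORTING_WH "
    "WITH WAREHOUSE_SIZE = 'XSMALL' AUTO_SUSPEND = 60 AUTO_RESUME = TRUE",
    "GRANT USAGE ON WAREHOUSE REPORTING_WH TO ROLE ANALYST_ROLE",
    "GRANT USAGE ON DATABASE ANALYTICS TO ROLE ANALYST_ROLE",
    "GRANT USAGE ON SCHEMA ANALYTICS.MARTS TO ROLE ANALYST_ROLE",
    "GRANT SELECT ON ALL TABLES IN SCHEMA ANALYTICS.MARTS TO ROLE ANALYST_ROLE",
]

def apply_admin_statements(cursor) -> None:
    """Apply the administration statements through an open Snowflake cursor."""
    for statement in ADMIN_STATEMENTS:
        cursor.execute(statement)
```

Running statements like these from a CI/CD pipeline keeps roles and grants version-controlled rather than applied by hand, which is one common way to meet the automation expectation above.
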
C. Data Quality & Reliability
  • Develop and maintain validation checks, profiling, and error-handling logic within pipelines (see the illustrative sketch after this list).
  • Collaborate with Data Analysts and Scientists to ensure datasets meet analytical and AI model requirements.
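
As a rough, hypothetical illustration of the validation and error-handling logic mentioned above, the sketch below runs a few batch-level checks with pandas before a dataset is published; the column names and input file are invented for the example, and a production pipeline might express equivalent checks as dbt tests instead.

```python
# Illustrative sketch only: simple batch-level data-quality checks of the kind a
# pipeline step might run before publishing a dataset. The columns and input file
# are hypothetical placeholders.
import pandas as pd

def validate_orders(df: pd.DataFrame) -> list[str]:
    """Return a list of data-quality failures; an empty list means the batch passes."""
    failures = []
    if df.empty:
        failures.append("batch is empty")
    if df["order_id"].duplicated().any():
        failures.append("duplicate order_id values")
    if (df["amount"] < 0).any():
        failures.append("negative order amounts")
    if df["order_date"].isna().any():
        failures.append("missing order_date values")
    return failures

batch = pd.read_parquet("orders_batch.parquet")  # hypothetical pipeline input
problems = validate_orders(batch)
if problems:
    # Fail the step loudly so a bad batch never reaches a dashboard or model.
    raise ValueError(f"Data quality checks failed: {problems}")
```
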
D. Collaboration & Mentorship
  • Partner with Business Analysts and the AI team to operationalise dashboards and machine-learning outputs.
  • Mentor junior engineers in coding, documentation, and data-architecture best practices.

Qualifications & Experience
  • Bachelor’s degree in Computer Science, Data Engineering, Information Technology, or a related discipline.
  • At least 8–10 years of experience in data engineering or platform development roles.
  • Hands-on experience with Snowflake, Azure Data Factory, and metadata catalogues (e.g., Alation, Collibra).
  • Proficient in SQL, Python, and data-orchestration tools (e.g., Airflow, dbt).
  • Strong knowledge of data modelling, cloud-warehouse architecture, and CI/CD automation.
  • Familiar with public‑sector data governance and compliance.
  • Demonstrated ability to collaborate across analytics, AI, and IT teams.