Senior Data Engineer

FLARE CONSULTING PTE. LTD.

Singapore

On-site

SGD 70,000 - 100,000

Full time

Job summary

A technology consulting firm is seeking a Senior Data Engineer to work with government agencies and improve their development delivery efficiency. Responsibilities include building data pipelines and ensuring data quality. Candidates should have strong experience with cloud platforms and data engineering. This is a one-year, extendable contract based in Singapore.

Qualifications

  • Proven experience with cloud platforms: AWS, Azure, or GCP.
  • Strong knowledge of data engineering and pipeline development.
  • Hands-on experience with Databricks.

Responsibilities

  • Collaborate with agency IT teams to ensure alignment on tech stack.
  • Translate business data requirements into technical specifications.
  • Design and build data ingestion pipelines to harmonize data.

Skills

Cloud platforms (AWS, Azure, GCP)
Data engineering and pipeline development
Databricks
Data orchestration tools (e.g., Airflow, Azure Data Factory)
Python
SQL
Shell scripting (Bash)
Data modeling and database operations
ETL frameworks

Tools

Docker
Git
Terraform

Job description

We are currently hiring for the role of Senior Data Engineer to work directly with government agencies, helping them enhance their development delivery efficiency and system resiliency through advanced CI/CD and Site Reliability Engineering (SRE) practices.

This is a forward-deployed, customer-facing role, where you’ll collaborate with partner agency teams to solve complex challenges, architect data solutions, and drive continuous improvements across infrastructure, tooling, and processes.

Contract: 1 year, extendable.

Key Responsibilities
  • Collaborate with agency IT teams to ensure alignment on tech stack, security, and infrastructure.
  • Translate business data requirements into technical specifications and implement scalable solutions.
  • Design and build data ingestion pipelines to collect, clean, and harmonize data from various sources.
  • Monitor and maintain databases and ETL systems (capacity planning, performance tuning, etc.).
  • Develop reusable data models and implement secure access mechanisms for data warehouse users.
  • Apply data governance, metadata management, and lineage tracking.
  • Perform data quality checks and enforce data security policies.
  • Contribute to product evolution by gathering feedback and suggesting new tools, technologies, or improvements.

Required Skills & Experience
  • Proven experience with cloud platforms: AWS, Azure, or GCP.
  • Strong knowledge of data engineering and pipeline development (batch & real-time).
  • Hands-on experience with Databricks.
  • Experience with data orchestration tools (e.g., Airflow, Azure Data Factory).
  • Proficiency in Python, SQL, and shell scripting (Bash).
  • Strong understanding of data modeling, database operations, and ETL frameworks.

Preferred Qualifications
  • Familiarity with CI/CD pipelines and DevOps tools (Docker, Git, Terraform).
  • Knowledge of data governance, security, and compliance in public sector environments.
  • Experience working with government systems and their data policies.
  • Domain experience in climate or weather-related datasets is a bonus.