
Data Engineer

Zenika Pte. Ltd.

Singapore

Hybrid

SGD 70,000 - 100,000

Full time

Job summary

A leading data consultancy in Singapore is seeking a Data Engineer Consultant to design and implement data platforms. You will work hands-on with AWS and Databricks, developing scalable data solutions. The ideal candidate has 5–7 years of experience in data engineering, strong technical skills, and excellent communication abilities. This role offers a hybrid work arrangement with comprehensive benefits and learning opportunities.

Benefits

20 days of annual leave + 5 LEAP days
Hybrid work arrangement
Dedicated Learning & Development budget
Flexible benefits package
Comprehensive international medical insurance

Qualifications

  • 5–7 years of experience in data engineering, ideally on AWS-native platforms.
  • Proven experience building ETL/ELT pipelines and managing real-time streaming architectures.
  • Hands-on expertise in Databricks, PySpark, Python, and SQL for large-scale data transformation.
  • Strong understanding of data governance principles.
  • Familiarity with CI/CD workflows and version control (GitLab, GitHub).
  • Solid grasp of data security, performance optimisation, and cost management.

Responsibilities

  • Design and build enterprise-scale data architectures.
  • Develop and maintain high-performance ETL/ELT pipelines.
  • Implement and optimise data transformations in Databricks.
  • Collaborate with and mentor junior engineers.
  • Integrate and automate data flows using APIs and AWS services.
  • Monitor and troubleshoot performance bottlenecks.

Skills

Data Engineering
AWS
Databricks
PySpark
Python
SQL
Data Warehouse Design
DevOps
ETL/ELT
Data Security

Education

Bachelor’s degree in Computer Science, Data Engineering, or a related field
Master’s degree (preferred)
AWS Certified Data Engineer – Associate
Databricks Certified Data Engineer

Tools

AWS Glue
Amazon Redshift
AWS S3
AWS Lambda
Kafka
Kinesis

Job description

Is there a Zenika in you?

Let’s talk skills and passion first.

You’re a data enthusiast who thrives on transforming complexity into clarity. With deep technical expertise in AWS and Databricks, you design robust, scalable, and high-performance data pipelines that power intelligent decisions. Curious by nature, you’re constantly exploring new tools, automation techniques, and architectures to deliver meaningful data solutions that scale.

Your Role as a Zenika Consultant:

As a Data Engineer Consultant, you’ll play a key role in designing and implementing data platforms for our clients — particularly in public sector and enterprise environments. You’ll work hands-on with technologies like AWS, Databricks, and PySpark, and collaborate with cross-functional teams to deliver scalable, production-ready data solutions.

You’ll work on projects that will allow you to:

  • Design and build enterprise-scale data architectures — including data lakes, warehouses, and real-time streaming pipelines.
  • Develop and maintain high-performance ETL/ELT pipelines that process large volumes of structured and unstructured data.
  • Implement and optimise data transformations in Databricks using PySpark, Python, and SQL, ensuring quality, scalability, and cost-efficiency.
  • Collaborate and mentor — guide junior engineers, review code, and drive adoption of best practices in data engineering and DevOps.
  • Integrate and automate data flows using APIs, AWS services (Glue, Redshift, S3, EMR, Lambda), and real-time tools like Kafka or Kinesis.
  • Monitor and troubleshoot performance bottlenecks, ensuring reliability, consistency, and security across all data operations.

What You Bring

  • 5–7 years of experience in data engineering, ideally on AWS-native platforms.
  • Proven experience building ETL/ELT pipelines, migrating data solutions, and managing real-time streaming architectures.
  • Hands-on expertise in Databricks, PySpark, Python, and SQL for large-scale data transformation.
  • Strong understanding of data warehouse design (Redshift, Snowflake) and data governance principles.
  • Familiarity with DevOps concepts, including CI/CD workflows and version control (GitLab, GitHub).
  • Experience with serverless compute (AWS Lambda, Azure Functions) and automation scripting.
  • Solid grasp of data security, performance optimisation, and cost management in cloud environments.
  • Excellent communication and collaboration skills — you translate complex technical details into clear business language.
  • Bonus: Experience with public sector or HR analytics projects, as well as mentoring and leading data teams.
  • Bachelor’s degree in Computer Science, Data Engineering, or a related field (Master’s preferred).
  • Certifications such as AWS Certified Data Engineer – Associate or AWS Certified Solutions Architect, Databricks Certified Data Engineer (Associate/Professional), Snowflake or Redshift certifications (a plus), and Agile/Scrum Master certification (preferred).

About Zenika

Founded by developer Carl Azoury, Zenika is a consultancy built around community, transparency, and craftsmanship. We are passionate technophiles who advise clients, bringing deep expertise in open-source technologies and modern solutions.

Why Join Zenika?
  • Work with a global client base across 11 locations, benefiting from over 28,000 Zenika-led training sessions
  • Partner with tech giants like Google Cloud and Scrum.org, and engage in research, open-source contributions, and conferences outside client projects
  • Participate in Zenika tech conferences (TechnoZaures) to learn, share, and grow together
  • Hybrid work arrangement
  • 20 days of annual leave + up to 5 LEAP (Learning, Education, Advancement, Progress) days
  • Dedicated Learning & Development (L&D) budget to support your growth
  • Flexible benefits package to cater to your well-being and lifestyle needs
  • Comprehensive international medical insurance package

Ready to code your story with us? Apply NOW!
