Data Platform Engineer

TEAMLEASE DIGITAL CONSULTING PTE. LTD.

Singapore

On-site

SGD 70,000 - 120,000

Full time

3 days ago

Job summary

A leading company in Singapore is seeking a Data Platform Engineer to manage and optimize global data solutions. The ideal candidate will have extensive experience in distributed systems, automation, and agile methodologies. This role requires strong technical skills in Python, Java, and data technologies, with a focus on building resilient data pipelines and collaborating with cross-functional teams to ensure high-quality data delivery.

Qualifications

  • 5+ years in large-scale distributed systems design and implementation.
  • Proficiency in Python and Java or similar languages.
  • Experience with Hadoop ecosystem and cloud migration.

Responsibilities

  • Manage Global Data Platform components and applications.
  • Automate infrastructure and CI/CD for data pipelines.
  • Collaborate with security and engineering teams for solution design.

Skills

Python
Java
Agile Project Management
Kubernetes
Kafka
Spark
DevOps
Data Migration
Continuous Integration
Machine Learning

Education

Bachelor’s degree in Computer Science or related field

Tools

Docker
Jenkins
Octopus
Dataiku

Job description

Professional and Technical

  • Bachelor’s degree in a relevant field such as Computer Science, Information Technology, Engineering or related areas
  • At least 5 years of experience in building or designing large-scale, fault-tolerant, distributed systems (for example: data lakes, delta lakes, data meshes, data lakehouses, data platforms, data streaming solutions)
  • In-depth knowledge and experience in one or more large-scale distributed technologies, including but not limited to: Hadoop ecosystem, Kafka, Kubernetes, Spark.
  • Migration experience of storage technologies (e.g. HDFS to S3 Object Storage)
  • Integration of streaming and file-based data ingestion/consumption (Kafka, Control-M, AWA)
  • Experience in DevOps, data pipeline development, and automation using Jenkins and Octopus (optional: Ansible, Chef, XL Release, and XL Deploy)
  • Expert in Python and Java, or another language such as Scala or R; Linux/Unix scripting, Jinja templates, Puppet scripts, and firewall configuration rules
  • VM setup and scaling (pods), Kubernetes (K8s) scaling, managing Docker with Harbor, and pushing images through CI/CD
  • Experience working with data formats such as Apache Parquet, ORC, or Avro. Experience in machine learning algorithms is a plus.
  • Hands-on experience in integrating Data Science Workbench platforms (e.g. Dataiku)
  • Cloud migration experience is advantageous
  • Experience with agile project management and methods (e.g., Scrum, SAFe)
  • Knowledge of the financial sector and its products is beneficial

Responsibilities

  • Operating Global Data Platform components (VM Servers, Kubernetes, Kafka) and applications (Apache stack, Collibra, Dataiku and similar).
  • Implement automation of infrastructure, security components, and Continuous Integration & Continuous Delivery for optimal execution of data pipelines (ELT/ETL).
  • Develop solutions to build resiliency in data pipelines through platform health checks, monitoring, and alerting mechanisms, thereby improving the quality, timeliness, recency, and accuracy of data delivery. Apply DevSecOps & Agile approaches to deliver a holistic and integrated solution in iterative increments.
  • Liaise and collaborate with enterprise security, digital engineering, and cloud operations to gain consensus on architecture solution frameworks.
  • Review system issues, incidents, and alerts to identify root causes and continuously implement features to improve platform performance.
  • Stay current on the latest industry developments and technology trends to effectively lead and design new features and capabilities.