Data Engineer

PSA Singapore

Singapore

On-site

SGD 70,000 - 90,000

Full time

Job summary

A leading port logistics company in Singapore is seeking a skilled Data Engineer for its Insights, Digitalization & Analytics department. The candidate will design and maintain scalable data solutions, work with cloud technologies, and implement data pipelines. Ideal candidates hold a bachelor's degree in Computer Science, have 2-3 years of relevant experience, and bring strong proficiency in data tools and analysis. Join our dynamic team and contribute to innovative projects!

Qualifications

  • 2-3 years of experience in data engineering or software engineering.
  • Strong data analysis and problem-solving abilities.
  • Analytical, meticulous, and self-motivated.

Responsibilities

  • Design, develop, and maintain ETL/ELT pipelines and API endpoints.
  • Monitor and optimize data quality and performance.
  • Collaborate with Data Scientists to maintain ML/AI projects.

Skills

Data analysis
Problem-solving
Team collaboration
Effective communication

Education

Bachelor's degree in Computer Science or related field

Tools

MS SQL Server
ETL tools (e.g., Microsoft SSIS)
Power BI
Azure DevOps

Job description

Overview

We are the World's Port of Call. Our winning formula is our People.

In our continuing journey to build great teams, we are looking for passionate individuals driven by a strong sense of purpose. It is only with the determination and commitment of our People that we can serve our customers, lead our industry and contribute to our nation to create new possibilities.

Working alongside one another, we can deliver extraordinary results together! Join #TeamPSA today!

DATA ENGINEER

Job no: 493706

Work type: Permanent

Categories: Infocomm Technology

About the Role

We are seeking a skilled Data Engineer to join the Insights, Digitalization & Analytics department. The ideal candidate will design, develop, and maintain scalable data solutions to drive analytics, machine learning, and business insights. Responsibilities include building data pipelines, APIs, and dashboards, deploying Machine Learning projects, and leveraging big data technologies and cloud platforms.

Responsibilities
  • Design, develop, and maintain ETL/ELT pipelines, API endpoints, and data applications across cloud and on-premise environments, integrating internal and external data sources, including web scraping of public data (ensuring compliance).
  • Monitor, optimize, and maintain data quality, performance, and availability through data cleansing, transformation, and deployment of scalable solutions.
  • Design, implement, and manage Azure cloud infrastructure to support scalable and secure deployment of cloud-native applications, including configuration of networking, storage, compute resources, and identity management. Ensure alignment with best practices for performance, cost optimization, and security compliance.
  • Implement CI/CD pipelines, automate deployments, and ensure scalability and performance for data and Machine Learning (ML)/AI solutions.
  • Create and optimize interactive dashboards to visualize business metrics, collaborating with stakeholders to design user-friendly interfaces and integrate them with backend data pipelines.
  • Collaborate with Data Scientists to deploy, monitor, and maintain ML/AI projects in production systems.

Requirements
  • Possess a bachelor’s degree in Computer Science, Computer Engineering, or a related field (specialization in Software Engineering is a plus).
  • 2–3 years of experience in data engineering or software engineering, with expertise in data warehousing, big data platforms, cloud technologies, and automation tools.
  • Strong data analysis, data verification, and problem-solving abilities.
  • Analytical, meticulous, and a strong team player.
  • Effective communication skills for collaboration across teams.
  • Ability to manage multiple tasks in a dynamic environment.
  • Self-motivated, with the initiative to learn new skills and technologies.

Technical Skills
  • Proficiency in data warehouse design, including relational databases (MS SQL Server), NoSQL, and ETL pipelines built with Python or ETL tools (e.g., Microsoft SSIS, Informatica IPC), together with data warehousing concepts, database optimization, and data governance (a brief illustrative sketch follows this list).
  • Familiarity with Python web application and API development tools (e.g., Flask, Requests) and web scraping tools (e.g., BeautifulSoup, Scrapy).
  • Skilled in Power BI, including DAX and Power Query, for creating reports and dashboards.
  • Experience with architecting and implementing Microsoft Azure services, including Azure Data Factory, Data Lake Storage, App Service, and Azure SQL, as well as CI/CD pipelines using Azure DevOps.
  • Knowledge of Machine Learning tools (e.g., AutoML platforms like Azure AutoML or DataRobot) and ML libraries (e.g., Scikit-learn, TensorFlow, PyTorch, Keras).
  • Familiarity with big data technologies (e.g., Hadoop, Hive, Spark) and the Databricks platform.
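
For readers unfamiliar with the day-to-day shape of this work, the sketch below is a minimal illustrative example, not PSA code, of the kind of Python ETL step described above: it pulls records from a hypothetical public API, reshapes them with pandas, and appends them to an illustrative SQL Server staging table. The URL, connection string, and table name are placeholders.

    # Minimal ETL sketch (illustrative only; all endpoints and names are placeholders).
    import pandas as pd
    import requests
    from sqlalchemy import create_engine

    API_URL = "https://example.com/api/vessel-calls"  # hypothetical source API
    CONN_STR = "mssql+pyodbc://user:pass@server/dw?driver=ODBC+Driver+18+for+SQL+Server"  # placeholder

    def extract() -> list[dict]:
        # Extract: fetch raw JSON records from the upstream API.
        resp = requests.get(API_URL, timeout=30)
        resp.raise_for_status()
        return resp.json()

    def transform(records: list[dict]) -> pd.DataFrame:
        # Transform: normalise column names and drop fully empty rows.
        df = pd.DataFrame(records)
        df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]
        return df.dropna(how="all")

    def load(df: pd.DataFrame) -> None:
        # Load: append the cleaned frame into a warehouse staging table.
        engine = create_engine(CONN_STR)
        df.to_sql("stg_vessel_calls", engine, if_exists="append", index=False)

    if __name__ == "__main__":
        load(transform(extract()))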

Advertised: 01 Oct 2025

Applications close: 31 Dec 2025
