Junior Data Analyst

UPS ASIA GROUP PTE. LTD.

Singapore

Hybrid

SGD 60,000 - 80,000

Full time

Posted 17 days ago

Job summary

A leading logistics and transportation firm in Singapore is looking for a Junior Data Analyst to join their hybrid analytics and engineering team. The successful candidate will be responsible for performing exploratory data analysis, designing scalable data pipelines, and developing scripts for data transformation. Proficiency in Python and SQL, along with strong problem-solving skills, is essential. This role offers the opportunity to work closely with team members and stakeholders on data-driven initiatives.

Qualifications

  • Strong proficiency in Python, especially with Pandas and NumPy.
  • Experience with large datasets and cloud-based data platforms.
  • Hands-on experience with GCP services, especially Dataflow and BigQuery.

Responsibilities

  • Perform exploratory data analysis to identify trends and actionable insights.
  • Design and implement scalable data pipelines using Google Cloud Dataflow.
  • Develop scripts for data cleaning, transformation, and enrichment.

Skills

Python
SQL
Data analysis
Data visualization
Cloud data platforms
Statistical methods
Git

Education

Bachelor’s degree in Data Science, Computer Science, Statistics, or related field

Tools

Google Cloud Platform (GCP)
Apache Beam
Looker

Job description

We are seeking a highly analytical and technically skilled Junior Data Analyst with development capabilities to join our hybrid analytics and engineering team. This role is ideal for someone passionate about data exploration, statistical analysis, and building scalable data solutions. You will partner closely with team members and business stakeholders to analyze large datasets, design and build data pipelines, and contribute to internal tools and platforms. You will also play a key role in our data discovery journey, supporting initiatives in process optimization, automation, and robotics solutions.

Job Responsibilities

  • Perform exploratory data analysis to identify trends, patterns, and actionable insights from large datasets.
  • Design and implement scalable data pipelines using Google Cloud Dataflow and the Apache Beam framework (see the illustrative sketch after this list).
  • Develop scripts and tools for data cleaning, transformation, and enrichment to support analytics and machine learning workflows.
  • Collaborate with stakeholders to understand business requirements and translate them into analytical solutions.
  • Create interactive dashboards and visualizations using Looker and other BI tools.
  • Work with Google Cloud Platform (GCP) services such as BigQuery, Cloud Functions, and Cloud Storage to manage and analyze data.
  • Support the development of internal applications and APIs that facilitate data access and usability.
  • Document data processes, methodologies, and technical specifications.
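
To give candidates a concrete picture of the pipeline work described above, here is a minimal illustrative sketch of an Apache Beam job of the kind this role involves. The bucket, table, dataset, and column names are hypothetical placeholders, not references to our actual systems.

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_row(line):
    # Split a CSV line into a dictionary and normalize the fields of interest.
    order_id, country, weight_kg = line.split(",")
    return {
        "order_id": order_id.strip(),
        "country": country.strip().upper(),
        "weight_kg": float(weight_kg),
    }


def run():
    # Runs locally by default; pass --runner=DataflowRunner (plus project and
    # region flags) on the command line to execute the same pipeline on Dataflow.
    options = PipelineOptions()
    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadRawCsv" >> beam.io.ReadFromText(
                "gs://example-bucket/shipments.csv", skip_header_lines=1)
            | "ParseAndClean" >> beam.Map(parse_row)
            | "DropInvalidWeights" >> beam.Filter(lambda row: row["weight_kg"] > 0)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "example-project:analytics.shipments_clean",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER)
        )


if __name__ == "__main__":
    run()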

Job Requirements

  • Bachelor’s degree in Data Science, Computer Science, Statistics, or a related field, or equivalent practical experience.
  • Strong proficiency in Python (e.g., Pandas, NumPy) and SQL; experience with R is a plus.
  • Experience working with large datasets and cloud-based data platforms.
  • Hands-on experience with GCP, especially Dataflow, BigQuery, and Cloud Functions.
  • Familiarity with Apache Beam for scalable data processing.
  • Proficiency with Looker or similar BI tools for visualization and reporting.
  • Understanding of statistical methods and basic data modeling techniques.
  • Fundamental programming skills in Python, JavaScript, or Java for tooling and scripting.
  • Working knowledge of Git and collaborative development workflows.
  • Excellent problem-solving, communication, and collaboration skills.