
Junior Data Engineer

TESCOM (SINGAPORE) SOFTWARE SYSTEMS TESTING PTE LTD.

Singapore

On-site

SGD 60,000 - 90,000

Full time

Today


Job summary

A technology solutions provider in Singapore is looking for a Data Engineer to develop and maintain scalable data systems. Candidates should possess strong programming skills in Python and SQL, with experience in both relational and NoSQL databases. Proficiency in cloud platforms like AWS, Azure, or GCP is essential. This role involves collaborating with data teams to implement data governance and troubleshoot issues in data workflows. The ideal candidate should be eager to learn and adapt in a fast-paced environment.

Qualifications

  • Strong Python and SQL programming skills.
  • Experience with relational and NoSQL databases.
  • Proficiency in major cloud providers for data services.
  • Working knowledge of big data tools like Apache Spark and Hadoop.
  • Experience in designing data extraction and loading processes.

Responsibilities

  • Develop, construct, test, and maintain scalable data systems and architectures.
  • Collect raw data and transform it into usable formats.
  • Collaborate with data scientists and analysts to meet data requirements.
  • Implement data governance and security measures.

Skills

Python
SQL
Java
Scala
PostgreSQL
MySQL
MongoDB
AWS
Azure
GCP
Apache Spark
Kafka
Hadoop
Apache Airflow
Redshift
Snowflake

Education

Polytechnic Diploma or Bachelor's degree in Computer Science, Data Analytics, Business Intelligence, or a related field

Tools

GitHub
GitLab
Job description
Key Responsibilities
  • Develop, construct, test, and maintain scalable data systems, pipelines, and architectures (data lakes, warehouses).
  • Collect raw data from various sources, clean it, and transform it into usable formats (ETL/ELT).
  • Work with data scientists, analysts, and business stakeholders to understand data requirements and deliver solutions.
  • Improve existing data frameworks, monitor workflows, and troubleshoot issues.
  • Implement data governance, validation, and security measures.
Qualifications & Skills
Education
  • Polytechnic Diploma or Bachelor's degree in Computer Science, Data Analytics, Business Intelligence, or a related field
Experience
  • Programming: Strong Python and SQL skills are fundamental, with Java/Scala useful for big data.
  • Databases: Experience with relational (PostgreSQL, MySQL) and NoSQL (MongoDB) databases.
  • Cloud Platforms: Proficiency in major cloud providers (AWS, Azure, GCP) and their data services (S3, Redshift, EMR, etc.).
  • Big Data: Working knowledge of tools like Apache Spark, Kafka, and Hadoop.
  • ETL/ELT: Designing and implementing data extraction, transformation, and loading processes.
  • Data Warehousing: Concepts and tools for storing large datasets (e.g., Redshift, Snowflake).
  • Orchestration: Tools like Apache Airflow for scheduling and managing workflows.
  • Data Modeling: Designing efficient database and data warehouse schemas.
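To illustrate the kind of ETL/ELT work described above, here is a minimal, self-contained Python sketch: extract rows from raw CSV text, transform them (drop incomplete records, cast types), and load them into a SQLite table. The sample data, table name, and schema are hypothetical, chosen only for illustration; real pipelines would use the cloud and orchestration tools listed in this posting.

```python
# Minimal ETL sketch (hypothetical data and table names).
import csv
import io
import sqlite3

# Extract: read raw CSV text (stand-in for a file or API response)
raw = "name,amount\nalice,10\nbob,\ncarol,25\n"
rows = list(csv.DictReader(io.StringIO(raw)))

# Transform: drop rows with a missing amount and cast to int
clean = [(r["name"], int(r["amount"])) for r in rows if r["amount"]]

# Load: insert the cleaned rows into an in-memory SQLite table
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE sales (name TEXT, amount INTEGER)")
con.executemany("INSERT INTO sales VALUES (?, ?)", clean)

total = con.execute("SELECT SUM(amount) FROM sales").fetchone()[0]
```

The same extract/transform/load shape scales up when the in-memory pieces are swapped for S3 objects, Spark jobs, and a warehouse such as Redshift or Snowflake, with Airflow scheduling each step.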
Technical Skills
  • Programming: Python, Java, SQL (essential).
  • Databases: Relational (SQL) and NoSQL databases.
  • Big Data Tools: Hadoop, Spark (often).
  • Cloud Platforms: AWS, Azure, GCP (increasingly common).
  • ETL/ELT Tools: Expertise in data integration tools.
  • Software Engineering: Strong understanding of data structures and algorithms.
Soft Skills
  • Strong analytical and problem-solving abilities.
  • Good communication skills to collaborate with non-technical teams.
  • Eagerness to learn and adapt in a fast-paced environment.
  • Open to learning new software and BI tools on the market.
Nice to Have
  • Knowledge of GitHub, GitLab
  • Knowledge of SDLC & DDLC
  • Understanding of Agile/Scrum methodologies.