Job Search and Career Advice Platform

Data Engineer

BASE CAMP DIGITAL PTE. LTD.

Singapore

On-site

SGD 70,000 - 100,000

Full time

5 days ago

Job summary

A leading technology company in Singapore is seeking a Data Engineer to design and manage data pipelines on AWS. The role involves optimizing data workflows and collaborating with teams to deliver effective data solutions. The ideal candidate has strong skills in AWS, Python, PySpark, and SQL to ensure high-performance data processing. Familiarity with Databricks is a plus. Join us to enhance our data capabilities in a dynamic environment.

Qualifications

  • Practical experience working with core AWS services used in data engineering.
  • Strong programming skills for data processing, automation, and scripting.
  • In-depth knowledge of Spark and distributed data processing.
  • Ability to write complex SQL queries and optimize database performance.

Responsibilities

  • Design, build, and manage robust data pipelines and ETL workflows using PySpark.
  • Develop and deploy scalable data solutions leveraging AWS services.
  • Optimize data processing workflows for performance, reliability, and efficiency.
  • Collaborate with team members to deliver scalable data solutions.
  • Maintain data security and ensure compliance with standards.

Skills

AWS
Python
PySpark
SQL

Tools

Databricks

Job description

Data Engineer

Skill sets:

  • AWS
  • Python
  • PySpark
  • SQL
  • Databricks (good to have)

Key Responsibilities
  • Design, build, and manage robust data pipelines and ETL workflows using PySpark on distributed computing environments.
  • Develop and deploy scalable data solutions leveraging AWS services such as S3, EMR, and Lambda.
  • Optimize data processing workflows to ensure high performance, reliability, and efficiency.
  • Collaborate closely with data engineers, analysts, and business teams to deliver effective and scalable data solutions.
  • Maintain data security and ensure adherence to organizational compliance standards.
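The pipeline responsibilities above follow the classic extract-transform-load pattern. A minimal sketch of that pattern, using only the Python standard library (in the actual role this would be PySpark DataFrames on EMR with S3 storage; the CSV data, table, and function names here are purely illustrative):

```python
import csv
import io
import sqlite3

# Hypothetical raw input; in production this would be files on S3.
RAW_CSV = """order_id,amount,region
1,120.50,SG
2,80.00,MY
3,230.25,SG
"""

def extract(raw: str):
    """Extract: parse raw CSV rows into dicts."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows):
    """Transform: cast types and keep Singapore orders only."""
    return [
        {"order_id": int(r["order_id"]), "amount": float(r["amount"])}
        for r in rows
        if r["region"] == "SG"
    ]

def load(rows, conn):
    """Load: write cleaned rows into a SQL table and report totals."""
    conn.execute("CREATE TABLE IF NOT EXISTS orders (order_id INT, amount REAL)")
    conn.executemany("INSERT INTO orders VALUES (:order_id, :amount)", rows)
    return conn.execute("SELECT COUNT(*), SUM(amount) FROM orders").fetchone()

conn = sqlite3.connect(":memory:")
count, total = load(transform(extract(RAW_CSV)), conn)
print(count, total)  # 2 350.75
```

The same three-stage shape carries over directly to PySpark, where each stage becomes a DataFrame read, a chain of transformations, and a write.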

Required Skills
  • AWS: Practical experience working with core AWS services used in data engineering.
  • Python: Strong programming skills for data processing, automation, and scripting.
  • PySpark: In-depth knowledge of Spark and distributed data processing.
  • SQL: Ability to write complex queries and optimize database performance.
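The SQL requirement pairs query writing with performance tuning. One small, concrete instance of that skill, sketched with Python's built-in sqlite3 (the table and index names are made up; the exact plan text varies by SQLite version):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id INT, ts TEXT, payload TEXT)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [(i % 100, f"2024-01-{i % 28 + 1:02d}", "x") for i in range(1000)],
)

query = "SELECT COUNT(*) FROM events WHERE user_id = 42"

# Without an index, the planner must scan every row.
plan_before = conn.execute("EXPLAIN QUERY PLAN " + query).fetchone()[3]

# Adding an index on the filtered column lets it seek instead.
conn.execute("CREATE INDEX idx_events_user ON events (user_id)")
plan_after = conn.execute("EXPLAIN QUERY PLAN " + query).fetchone()[3]

print(plan_before)  # a full-table SCAN
print(plan_after)   # a SEARCH using idx_events_user
```

The same scan-versus-seek reasoning applies to warehouse engines, where partitioning and clustering keys play the role the index plays here.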

Nice to Have
  • Experience working with Databricks for large-scale data analytics.
  • Understanding of CI/CD workflows and DevOps practices.
  • Familiarity with data lake and data warehouse architectures.