AWS PySpark Developer

BASE CAMP RECRUITMENT PTE. LTD.

Singapore

On-site

SGD 60,000 - 80,000

Full time

Posted 5 days ago

Job summary

A leading recruitment firm in Singapore is seeking a Data Engineer for a 12-month contract. The role involves designing, developing, and maintaining data pipelines using PySpark, as well as implementing scalable solutions on AWS. Strong programming skills in Python and expertise in SQL are essential. Experience with Databricks and knowledge of CI/CD practices are advantageous. This position offers competitive compensation and the possibility of contract extension.

Benefits

Competitive Compensation

Qualifications

  • Hands-on experience with core AWS services for data engineering.
  • Strong programming skills for data processing and automation.
  • Expertise in distributed data processing and Spark framework.
  • Proficiency in writing complex SQL queries and optimizing performance.

Responsibilities

  • Design, develop, and maintain data pipelines and ETL processes.
  • Implement scalable solutions on AWS cloud services.
  • Optimize data workflows for performance and reliability.
  • Collaborate with stakeholders to deliver high-quality solutions.
  • Ensure data security and compliance with standards.

Skills

AWS
Python
PySpark
SQL

Tools

Databricks

Job description
What’s on Offer:
  • Industry: Consulting
  • Location: Singapore
  • 12-month contract role (with the possibility of extension)
  • Competitive Compensation
Job Description:
  • Design, develop, and maintain data pipelines and ETL processes using PySpark on distributed systems (a minimal sketch follows this list).
  • Implement scalable solutions on AWS cloud services (e.g., S3, EMR, Lambda).
  • Optimize data workflows for performance and reliability.
  • Collaborate with data engineers, analysts, and business stakeholders to deliver high-quality solutions.
  • Ensure data security and compliance with organizational standards.
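
As an illustration of the kind of pipeline work described above, the sketch below reads raw data from S3, applies a simple transformation, and writes partitioned output back to S3 with PySpark. It is a minimal, hypothetical example: the bucket names, paths, and column names are placeholders, not details taken from this posting.

  from pyspark.sql import SparkSession
  from pyspark.sql import functions as F

  spark = SparkSession.builder.appName("example-etl").getOrCreate()

  # Extract: read raw JSON events from a hypothetical S3 bucket
  raw = spark.read.json("s3a://example-raw-bucket/events/")

  # Transform: drop rows missing a user_id, derive a date column,
  # and aggregate daily event counts per user
  daily = (
      raw.filter(F.col("user_id").isNotNull())
         .withColumn("event_date", F.to_date("event_ts"))
         .groupBy("user_id", "event_date")
         .count()
  )

  # Load: write partitioned Parquet back to S3 for downstream consumers
  (daily.write.mode("overwrite")
        .partitionBy("event_date")
        .parquet("s3a://example-curated-bucket/daily_user_counts/"))

  spark.stop()
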
Job Requirements:
  • AWS: Hands-on experience with core AWS services for data engineering.
  • Python: Strong programming skills for data processing and automation.
  • PySpark: Expertise in distributed data processing and Spark framework.
  • SQL: Proficiency in writing complex queries and optimizing performance.
Good to Have:
  • Databricks: Experience with Databricks platform for big data analytics.
  • Knowledge of CI/CD pipelines and DevOps practices.
  • Familiarity with data lake and data warehouse concepts.