Job Search and Career Advice Platform

Remote Data Engineer: Spark, Databricks & Cloud Pipelines

Jobgether

Remote

MYR 60,000 – 85,000

Full time

2 days ago

Job summary

A recruitment firm is seeking a Data Engineer to work remotely. The role involves designing and implementing scalable data solutions that support business decisions. The ideal candidate has experience developing data pipelines with technologies such as Databricks, SQL, and Python. Responsibilities include building ETL workflows and managing data lakes on cloud platforms (Azure, AWS). The position offers flexible working conditions, professional growth opportunities, and a collaborative work culture.

Benefits

Flexible remote working conditions
Opportunities for professional growth
Collaborative company culture
Access to modern technologies
Health and wellness benefits
Work‑life balance

Qualifications

  • 3–6 years of experience in Data Engineering or related roles.
  • Hands‑on experience with big data processing frameworks and data lakes.
  • Strong understanding of distributed systems and big data technologies.

Responsibilities

  • Design, build, and optimize ETL/ELT workflows using Databricks, SQL, and Python/PySpark (a minimal sketch follows this list).
  • Develop and maintain robust, scalable, and efficient data pipelines for processing large datasets.
  • Work on cloud platforms (Azure, AWS) to build and manage data lakes and scalable architectures.
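For illustration only, here is a minimal PySpark sketch of the kind of ETL workflow these responsibilities describe. The paths, column names, and app name are hypothetical placeholders, not details from the posting.

  # Hypothetical ETL step: read raw events from a data-lake landing zone,
  # clean and aggregate them, and write a partitioned table back to the lake.
  from pyspark.sql import SparkSession
  from pyspark.sql import functions as F

  spark = SparkSession.builder.appName("example-etl").getOrCreate()

  # Extract: raw JSON events from a placeholder landing path.
  raw = spark.read.json("s3://example-lake/landing/events/")

  # Transform: drop rows without an event type, derive a date column,
  # and count events per day and type.
  daily = (
      raw.where(F.col("event_type").isNotNull())
         .withColumn("event_date", F.to_date("event_ts"))
         .groupBy("event_date", "event_type")
         .count()
  )

  # Load: write the curated output partitioned by date.
  daily.write.mode("overwrite").partitionBy("event_date").parquet(
      "s3://example-lake/curated/daily_event_counts/"
  )

Production pipelines of this sort typically add incremental loads, schema enforcement, and orchestration (e.g. via Azure Data Factory or AWS Glue, both listed under Tools below), which this sketch omits.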

Skills

Python
SQL
Databricks
Apache Spark
PySpark
Azure
AWS
ETL
CI/CD
Git

Education

Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field

Tools

Azure Data Factory
AWS Glue
Jenkins
Azure DevOps