
Cloud & Data Engineering

DDT SOFTWARE & E-COMM (OPC) PVT LTD

Singapore

On-site

SGD 70,000 - 120,000

Full time

Yesterday


Job summary

A leading company in the software and e-commerce sector is looking for two Cloud & Data Engineering Specialists. These roles focus on developing scalable cloud solutions and data pipelines, requiring extensive experience with cloud platforms and data analytics tools. Candidates should be proficient in Python and SQL, with strong analytical skills and a collaborative mindset. This is a fantastic opportunity to join a dynamic team committed to delivering innovative data solutions.

Qualifications

  • 5+ years of experience in cloud and data engineering roles.
  • Proven ability to deliver scalable data solutions.
  • Strong problem-solving and analytical skills.

Responsibilities

  • Design and implement cloud solutions across Azure, AWS, and GCP platforms.
  • Develop and optimize data pipelines using PySpark, Python, and SQL.
  • Build and manage ETL workflows using Azure Data Factory.

Skills

Cloud platforms proficiency
Programming skills in Python
SQL
PySpark
ETL tools expertise
Data analysis skills

Education

Bachelor's degree in Computer Science, Engineering, or related field

Tools

Azure Data Factory
Apache Spark
Databricks
Tableau
Power BI
Git
Docker

Job description

    We are seeking two highly skilled Cloud & Data Engineering Specialists to join our dynamic team. These roles will focus on designing, building, and optimizing scalable cloud-based solutions, data pipelines, and analytics platforms. The ideal candidates will have strong expertise in cloud platforms, data engineering, and modern technologies, with a focus on delivering robust, secure, and efficient data solutions.

Location: Off-Shore (India)
Work Hours: Overlap till 12pm CST

Key Responsibilities:

  • Design and implement cloud solutions across Azure, AWS, and GCP platforms.
  • Develop and optimize data pipelines using PySpark, Python, and SQL.
  • Build and manage ETL workflows using Azure Data Factory (ADF).
  • Work with big data technologies such as Apache Spark and Databricks to process large datasets.
  • Design and deliver dashboards and reports using Tableau and Power BI.
  • Implement DevOps practices, including version control with Git, CI/CD pipelines, and containerization using Docker.
  • Collaborate with stakeholders to gather requirements and deliver scalable data solutions.

Key Skills:

  • Proficiency in Azure, AWS, and GCP cloud platforms.
  • Strong programming skills in Python, SQL, and PySpark.
  • Experience with Snowflake and SQL Server databases.
  • Expertise in ETL tools like Azure Data Factory (ADF).
  • Hands-on experience with Apache Spark and Databricks for big data processing.
  • Proficiency in reporting tools such as Tableau and Power BI.
  • Knowledge of DevOps practices, including Git, CI/CD pipelines, and Docker.

General Requirements:

  • Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent experience).
  • 5+ years of experience in cloud and data engineering roles.
  • Strong problem-solving and analytical skills.
  • Excellent communication and collaboration abilities.
  • Proven ability to work in a fast-paced, agile environment.
  • Recruiter Details: DDT SOFTWARE & E-COMM (OPC) PVT LTD
  • Job Tags: sql, docker, gcp, aws, apache spark