Data Engineer (IDMC, AWS, Redshift, ETL, Databricks)

NEPTUNEZ SINGAPORE PTE. LTD.

Singapore

On-site

SGD 85,000 - 120,000

Full time


Job summary

A leading data solutions firm in Singapore is seeking an experienced Data Engineer to design and maintain robust data pipelines and ETL processes. The ideal candidate has a strong background in AWS cloud services, Databricks, and data modeling. You will be responsible for optimizing data workflows and ensuring data quality in a dynamic environment. This role offers a competitive salary and opportunities for professional development.

Qualifications

  • Minimum 5 years of experience in data warehousing, big data, or advanced analytics solutions.
  • Hands-on experience with Databricks (Delta Lake, MLflow, Spark).
  • Strong knowledge of AWS cloud services (e.g., AWS Glue, Redshift, S3, Lambda, Kinesis, Athena, EMR).
  • CI/CD & DevOps best practices for data pipelines and cloud infrastructure.

Responsibilities

  • Design, develop, and maintain scalable data pipelines and ETL/ELT processes.
  • Define and implement data models and warehouse architectures for analytics.
  • Develop and optimize ETL workflows using IDMC.
  • Integrate and process large-scale data using Big Data technologies.
  • Collaborate with data governance teams to maintain metadata and compliance.

Skills

Data warehousing
Big Data technologies
SQL
ETL/ELT workflows
AWS cloud services
Databricks
Data visualization tools
CI/CD & DevOps
Data modeling
Infrastructure as Code

Certifications

AWS Certified Solutions Architect – Associate or Professional
Databricks Certified Data Engineer
Informatica IDMC Certification

Tools

AWS Glue
Redshift
S3
Terraform
Power BI
Tableau

Job description

Responsibilities

  • Design, develop, and maintain scalable data pipelines and ETL/ELT processes for structured and unstructured data.
  • Define and implement data models and warehouse architectures (star/snowflake schema) for analytics and reporting.
  • Develop and optimize ETL workflows using IDMC (Informatica Intelligent Data Management Cloud).
  • Perform data ingestion, transformation, and loading across multiple sources such as Oracle, MS SQL, MySQL, and Teradata.
  • Manage and maintain databases across those platforms (Oracle, MS SQL, MySQL, Teradata).
  • Write and optimize complex SQL queries and stored procedures, and tune performance for large datasets.
  • Integrate and process large-scale data using Big Data technologies (e.g., Hadoop, Spark, Hive, or cloud equivalents like AWS Glue, Redshift, S3, Azure Data Factory, or GCP Dataflow).
  • Implement data validation, profiling, and quality checks to ensure accuracy and reliability.
  • Collaborate with data governance teams to maintain metadata, lineage, and compliance with security standards.
  • Document data flows, integration processes, and design specifications for maintainability.
  • Support migration and modernization initiatives (e.g., moving from an on-premises data warehouse to cloud-based systems).
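The responsibilities above revolve around loading data into a dimensional (star/snowflake) model and querying it for analytics. As a minimal, hypothetical sketch of that pattern, using Python's built-in sqlite3 in place of a real warehouse such as Redshift (all table and column names are invented for illustration):

```python
import sqlite3

# In-memory database stands in for a warehouse such as Redshift.
conn = sqlite3.connect(":memory:")

# A tiny star schema: one fact table referencing one dimension table.
conn.executescript("""
    CREATE TABLE dim_customer (
        customer_id INTEGER PRIMARY KEY,
        name        TEXT,
        country     TEXT
    );
    CREATE TABLE fact_sales (
        sale_id     INTEGER PRIMARY KEY,
        customer_id INTEGER REFERENCES dim_customer(customer_id),
        amount      REAL
    );
""")

# "Extract" from a source (here, a hard-coded list), "transform"
# (normalise country codes), then "load" into the warehouse tables.
source_rows = [
    (1, "Alice", "sg", 120.0),
    (2, "Bob", "SG", 80.5),
]
for cust_id, name, country, amount in source_rows:
    conn.execute("INSERT OR IGNORE INTO dim_customer VALUES (?, ?, ?)",
                 (cust_id, name, country.upper()))
    conn.execute("INSERT INTO fact_sales (customer_id, amount) VALUES (?, ?)",
                 (cust_id, amount))

# Analytics query joining fact to dimension, as a BI tool would.
total = conn.execute("""
    SELECT d.country, SUM(f.amount)
    FROM fact_sales f JOIN dim_customer d USING (customer_id)
    GROUP BY d.country
""").fetchone()
print(total)  # ('SG', 200.5)
```

In production the same shape would be expressed in IDMC mappings or Spark/Databricks jobs rather than hand-written inserts, but the extract-transform-load flow and the fact/dimension split are the same.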

Requirements

  • Minimum 5 years of experience in data warehousing, big data, or advanced analytics solutions.
  • Experience with databases (e.g., Oracle, MS SQL, MySQL, Teradata, Databricks).
  • Expertise in data repository design (e.g., operational data stores, data marts, data lakes).
  • Proficiency in data query techniques (e.g., SQL, NoSQL, Spark SQL).
  • Hands-on experience with Databricks (Delta Lake, MLflow, Spark).
  • Experience with Informatica Data Management Cloud (IDMC) for data integration, transformation, and governance.
  • Must-have: Strong knowledge of AWS cloud services (e.g., AWS Glue, Redshift, S3, Lambda, Kinesis, Athena, EMR).
  • Experience in building and optimizing ETL/ELT workflows using AWS native tools, Databricks, or IDMC.
  • Understanding of event-driven architectures and microservices.
  • Data modeling experience (e.g., Star Schema, Snowflake Schema).
  • Experience in data visualization tools (e.g., Power BI, Tableau).
  • Infrastructure as Code (IaC): Terraform, CloudFormation.
  • CI/CD & DevOps best practices for data pipelines and cloud infrastructure.
  • Identity and Access Management (IAM), security best practices, and data governance.
  • AWS Certified Solutions Architect – Associate or Professional
  • Databricks Certified Data Engineer
  • Informatica IDMC Certification
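Several of the items above (data validation, profiling, and quality checks) amount to asserting invariants on each batch before it is loaded. A minimal sketch in plain Python, where the rule names and sample records are illustrative only, not taken from the posting:

```python
# Batch-level data quality checks, sketched in plain Python.
# Rule names and the sample batch below are illustrative only.

def profile_batch(rows):
    """Run simple validation rules and return a count of failures per rule."""
    failures = {"null_id": 0, "negative_amount": 0, "duplicate_id": 0}
    seen_ids = set()
    for row in rows:
        if row.get("id") is None:
            failures["null_id"] += 1
            continue
        if row["id"] in seen_ids:
            failures["duplicate_id"] += 1
        seen_ids.add(row["id"])
        if row.get("amount", 0) < 0:
            failures["negative_amount"] += 1
    return failures

batch = [
    {"id": 1, "amount": 10.0},
    {"id": 1, "amount": 5.0},    # duplicate id
    {"id": None, "amount": 3.0}, # missing id
    {"id": 2, "amount": -7.0},   # negative amount
]
report = profile_batch(batch)
print(report)  # {'null_id': 1, 'negative_amount': 1, 'duplicate_id': 1}
```

Tools such as AWS Glue Data Quality or Databricks expectations express the same idea declaratively; the point is that each rule is checked per batch and failures are surfaced before the load proceeds.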