

Senior Data Engineer

Neutron

Singapore

On-site

SGD 70,000 - 100,000

Full time

Today


Job summary

A leading tech company in Singapore is seeking a Senior Data Engineer to lead AWS migrations, design scalable data systems, and ensure data governance. Candidates should be proficient in Python and AWS services, have strong communication skills, and bring at least 5 years of experience in data engineering or cloud infrastructure. This position offers a dynamic work environment and the opportunity to train junior engineers.

Qualifications

  • 5–7+ years of experience in data engineering or cloud infrastructure.
  • Experience with data pipeline development.
  • Relevant certifications such as AWS Certified Data Engineer are advantageous.

Responsibilities

  • Lead migration to cloud-based data platform.
  • Develop data lakes and warehouses.
  • Implement data validation and governance processes.
  • Mentor junior engineers and conduct training sessions.
  • Stay updated with emerging technologies.

Skills

Proficiency in Python
Familiarity with Java or Scala
Strong knowledge of SQL
Experience with AWS services
Experience with version control (Git)
Understanding of serverless architectures
Knowledge of data governance
Strong communication skills

Education

Bachelor’s degree in Computer Science or related field

Tools

AWS services (S3, Lambda, Glue, etc.)
BI tools (Tableau, Power BI, QuickSight)

Job description

Senior Data Engineer / AWS Migration Lead

Key Responsibilities

1. Migration & Technical Implementation
  • Lead migration of on-premises and SharePoint/legacy systems to our cloud-based data platform.
  • Collaborate with cloud engineering teams to develop migration strategies and implementation plans.
  • Re-architect existing Python scripts and UiPath automation workflows for cloud-native services.
  • Design and implement scalable ETL/ELT pipelines and data integration workflows.
  • Migrate Tableau dashboards to modern BI platforms such as QuickSight, ensuring full functionality.
  • Establish data connections from multiple enterprise systems.
  • Ensure seamless transition with minimal disruption.
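As a rough illustration of the ETL/ELT pipeline work described above, a minimal extract–transform–load step might look like the following Python sketch. The function names, record schema, and in-memory "warehouse" are hypothetical stand-ins, not details from the posting:

```python
# Minimal ETL sketch: extract records, harmonise them, load into a target.
# All names (extract, transform, load, the record fields) are illustrative only.

def extract():
    # Stand-in for reading from a source system (e.g. S3, an RDS table).
    return [
        {"name": " Alice ", "dept": "ENG", "salary": "90000"},
        {"name": "Bob", "dept": "eng", "salary": "85000"},
    ]

def transform(records):
    # Cleanse and harmonise: trim whitespace, normalise case, cast types.
    return [
        {
            "name": r["name"].strip(),
            "dept": r["dept"].lower(),
            "salary": int(r["salary"]),
        }
        for r in records
    ]

def load(records, target):
    # Stand-in for writing to a warehouse table; returns rows loaded.
    target.extend(records)
    return len(records)

warehouse = []
loaded = load(transform(extract()), warehouse)
```

In practice each stage would be backed by managed services (e.g. Glue jobs reading from S3 and writing to a warehouse), but the shape of the pipeline is the same.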

2. Data Infrastructure & Architecture
  • Guide and plan data engineering processes across digital systems.
  • Develop and maintain data lakes, warehouses, and database infrastructure.
  • Define data architecture standards and DevOps best practices.
  • Support the development of pipelines integrating data from multiple platforms.

3. Data Quality & Governance
  • Implement data validation, cleansing, and harmonisation processes to ensure integrity.
  • Monitor pipelines for issues and performance bottlenecks.
  • Support data governance, access control, and compliance needs.
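The validation and cleansing duties listed above could be sketched minimally in Python as follows. The schema and the integrity rules are hypothetical examples, not requirements from the posting:

```python
# Minimal data-validation sketch: flag rows that fail simple integrity rules.
# The field names ("id", "amount") and rules are illustrative only.

def validate(rows):
    valid, rejected = [], []
    for row in rows:
        errors = []
        if not row.get("id"):
            errors.append("missing id")
        if not isinstance(row.get("amount"), (int, float)) or row["amount"] < 0:
            errors.append("bad amount")
        # Keep the errors alongside each row so bad records can be reported.
        (rejected if errors else valid).append((row, errors))
    return valid, rejected

rows = [
    {"id": "a1", "amount": 10.5},
    {"id": None, "amount": 3},
    {"id": "a2", "amount": -1},
]
valid, rejected = validate(rows)
```

A production pipeline would typically log or quarantine the rejected rows and surface the counts to monitoring, rather than silently dropping them.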

4. System Maintenance & Documentation
  • Maintain cloud and on-prem data infrastructure.
  • Troubleshoot and resolve data-related issues.
  • Create and maintain documentation for engineering processes.

5. Stakeholder Collaboration
  • Work with product teams, analysts, and stakeholders to translate business requirements into technical solutions.
  • Develop and deploy data tables, marts, and visualisation layers for reporting.
  • Support users in accessing analytics dashboards.

6. Training & Knowledge Transfer
  • Mentor and train junior engineers.
  • Conduct knowledge-sharing sessions.
  • Prepare training materials for new systems.

7. Innovation & Continuous Improvement
  • Stay updated with emerging technologies and contribute to process enhancements.
  • Participate in continuous learning and internal capability development.

Skills & Knowledge
  • Proficiency in Python and familiarity with Java or Scala.
  • Strong knowledge of SQL, data modelling, schema design, and ETL/ELT processes.
  • Experience with AWS services such as S3, Lambda, Glue, SageMaker, Athena, RDS, QuickSight.
  • Familiarity with big data and analytics frameworks (e.g., Spark, Databricks).
  • Experience with version control (Git) and DevOps pipelines (GitLab, CI/CD, Nexus).
  • Understanding of serverless architectures and Infrastructure-as-Code (CloudFormation, YAML/JSON).
  • Knowledge of data governance, security best practices, and enterprise access controls.
  • Experience with BI tools such as Tableau, Power BI, or QuickSight.
  • Strong communication, analytical thinking, and stakeholder management skills.

Requirements
  • Bachelor’s degree in Computer Science, Data Science, Engineering, IT, or related field.
  • 5–7+ years of experience in data engineering, cloud infrastructure, or platform engineering.
  • Experience with cloud platforms (preferably AWS) and data pipeline development.
  • Experience with data visualisation and business intelligence tools.
  • Familiarity with agile development methodologies.
  • Relevant certifications such as AWS Certified Data Engineer or Databricks Certified Data Engineer are advantageous.