Data Engineer Delivery Lead

RAPSYS TECHNOLOGIES PTE. LTD.

Singapore

On-site

SGD 100,000 - 140,000

Full time

Job summary

A leading technology company in Singapore is seeking an experienced professional for data platform delivery management. The role involves overseeing data mapping, collaborating with vendors, and leading a team of data engineers. The ideal candidate has more than 8 years of experience in data integration and cloud platforms, strong leadership skills, and exceptional communication abilities.

Qualifications

  • 8+ years of hands-on experience in data integration or data platform development.
  • 3+ years in a lead role on complex data projects.
  • Expertise in creating detailed source-to-target data mapping documents.
  • Advanced proficiency in SQL.

Responsibilities

  • Support the design and implementation of a scalable data platform.
  • Manage timelines and deliverables to ensure project success.
  • Establish testing strategies for data quality during migration.
  • Provide leadership and mentorship to data engineers and analysts.

Skills

Data Integration
Data Warehousing
Cloud Platforms (AWS, Azure, GCP)
SQL
Vendor Management
Communication Skills
Data Mapping

Education

Bachelor’s degree in Computer Science or a related field

Tools

dbt
Airflow
Talend
Informatica

Job description

Responsibilities

Data Platform Delivery Management
  • Support the design, development, and implementation of a scalable, reliable, and high-performance data platform, translating business needs into technical specifications.
  • Manage timelines, milestones, and deliverables to ensure successful project execution.

Data Mapping and Transformation
  • Take full ownership of the data mapping process, a critical component for the success of data migration and integration within the transformation program.
  • Analyze source systems to define data flows from domain systems into the new platform, as well as data migration requirements from legacy systems.
  • Manage the creation, review, and sign-off of detailed Source-to-Target Mapping (STM) specifications, ensuring accuracy and completeness.
  • Collaborate closely with Data Engineers, Data Architects, and Business Analysts to define and implement complex data transformation logic and business rules.
  • Establish and execute a robust testing strategy to guarantee data quality, consistency, and integrity throughout the migration lifecycle.

Stakeholder and Vendor Collaboration
  • Act as the primary technical point of contact for external vendors involved in the data platform build and data mapping activities.
  • Rigorously manage vendor performance, Statements of Work (SOWs), and deliverables to ensure alignment with project goals, timelines, and quality standards.

Leadership and Team Management
  • Provide technical leadership, guidance, and mentorship to a team of internal and external data engineers, analysts, and developers.
  • Champion best practices in data governance, data quality, and data security across the team.
  • Foster a culture of excellence, collaboration, and innovation within the data delivery team, utilizing Agile/Scrum methodologies to maximize productivity.

Qualifications & Experience

Required
  • Bachelor’s degree in Computer Science, Engineering, Information Systems, or a related field.
  • 8+ years of hands‑on experience in data integration, data warehousing, or data platform development.
  • 3+ years of experience in a lead role, successfully delivering large-scale, complex data projects, particularly involving data migration and system integration.
  • Expertise in Data Mapping: Proven, extensive experience creating detailed source-to-target data mapping documents based on complex business requirements.
  • Cloud Platform Proficiency: Deep knowledge and practical experience with at least one major cloud platform: AWS (e.g., S3, Glue, Redshift), Azure (e.g., ADLS, Data Factory, Synapse), or GCP (e.g., Cloud Storage, Dataflow, BigQuery).
  • Modern Data Stack: Strong understanding of modern data warehousing technologies like Snowflake, BigQuery, or Redshift.
  • ETL/ELT Experience: Hands‑on experience with data pipeline and orchestration tools such as dbt, Airflow, Talend, or Informatica.
  • Vendor Management: Demonstrated experience managing relationships, contracts, and performance of third‑party technology vendors.
  • Communication Skills: Exceptional ability to communicate complex technical concepts to both technical and non‑technical audiences.
  • Advanced proficiency in SQL.

Preferred
  • Master’s degree in a relevant field.
  • Professional cloud certifications (e.g., AWS Certified Data Analytics, Google Professional Data Engineer).
  • Programming experience with Python or Scala for data processing.
  • Knowledge of data governance frameworks (e.g., DAMA‑DMBOK).
  • Experience working in an Agile/Scrum environment, with Scrum Master experience being a plus.