Senior Principal Data Engineering Lead

Cygnify Pte Ltd

Singapore

On-site

SGD 100,000–140,000

Full time

7 days ago

Job summary

A leading data solutions firm in Singapore is looking for a Senior Principal Data Engineering Lead to drive its data engineering function and ensure excellence in cloud-native data delivery. The position spans team leadership, data quality and stewardship processes, and engineering standards. Ideal candidates will have 8–12 years of experience in data engineering, strong AWS knowledge, and the ability to mentor diverse teams. This role offers a dynamic environment focused on building scalable, high-quality data solutions.

Qualifications

  • 8–12 years of experience in cloud-native data engineering.
  • Proven leadership of cross-functional engineering teams.
  • Deep hands-on technical expertise in Snowflake and AWS services.

Responsibilities

  • Lead recruitment and mentoring of data teams.
  • Oversee design, development, and optimization of data pipelines.
  • Manage daily pipeline operations and SLA compliance.
  • Set engineering standards for observability and CI/CD.
  • Enable self-service analytics by curating trusted datasets.

Skills

Cloud-native data engineering
Leadership
AWS architecture
Data ingestion and transformation
Python
SQL
Data orchestration tools
Data quality frameworks

Tools

Snowflake
Airflow
Fivetran
Airbyte
AWS services

Job description

Role: Senior Principal Data Engineering Lead
Location: Singapore

Purpose: To lead and scale the Data Engineering, DataOps and Data Stewardship functions within the Data organization. This role ensures end-to-end delivery excellence of the cloud-native data platform – spanning data ingestion, transformation, modeling, and operations – to enable reliable, high-quality, self-service analytics across business domains.

Responsibilities
  • Team Leadership: Recruit, mentor, and lead a hybrid team of data engineers and stewards across Singapore, Malaysia and India, establishing in-house technical leadership and delivery ownership.
  • Data Engineering Delivery: Oversee design, development, and optimization of ELT/ETL pipelines and data models, ensuring scalable, reusable, and cost-efficient workflows.
  • Data Quality & Stewardship: Institutionalize stewardship processes: define ownership models, implement data quality (DQ) monitoring, and drive remediation workflows with cross-functional data users.
  • Operational Excellence: Manage daily pipeline operations, SLA compliance, and production issue resolution with strong root-cause analysis and continuous improvement.
  • Technical Governance: Set engineering standards for observability, RBAC, cost tagging, and CI/CD practices.
  • Collaboration & Enablement: Enable self-service analytics by curating trusted datasets and modeled views, working with BI and business teams.
Qualifications
  • 8–12 years of experience in cloud-native data engineering, with strong architecture and delivery experience on AWS.
  • Proven leadership of cross-functional and hybrid engineering teams, including vendor-augmented resources.
  • Experience partnering with BI and business teams to design modeled datasets and enable self-service analytics.
  • Deep hands-on technical expertise, including:
      • Snowflake: schema design, Streams/Tasks, stored procedures, UDFs, RBAC, performance tuning, Cortex AI, Streamlit, cost monitoring.
      • Airflow or similar orchestration tools: pipeline scheduling, dependency management, and observability.
      • Python and SQL: pipeline scripting, transformation logic, and data validation.
      • ELT/ETL frameworks: Airbyte, Fivetran, and custom connector development.
      • AWS services: S3 (data lake structures and archival), Lambda, KMS, Transfer Family, CloudWatch, SageMaker.
  • Demonstrated success delivering medallion architecture (Bronze/Silver/Gold) and enabling self‑service data use cases.
  • Experience building data quality frameworks, stewardship policies, and data lineage tracking across enterprise datasets.
  • Familiarity with machine learning integration using platforms like AWS SageMaker.
  • Proven ability to troubleshoot complex data issues, lead root‑cause analysis, and ensure production stability.
  • Track record of transitioning delivery ownership from vendors to internal teams while maintaining quality and velocity.