
A consulting firm in Singapore is seeking a Data Platform Technical Lead to design and operate a cloud-based data platform on Azure Databricks. Responsibilities include leading platform design, building data pipelines, optimising performance, and enforcing data engineering standards. The ideal candidate has at least 5 years of data engineering experience with hands-on skills in Azure technologies. This hybrid role calls for strong communication and collaboration with stakeholders and offers a flexible work arrangement.
We are looking for a Data Platform Technical Lead to drive the design, build, and operation of a modern cloud-based data platform on Microsoft Azure and Databricks.
This role is ideal for a hands‑on senior data engineer who enjoys architecting scalable data solutions, setting engineering standards, and guiding teams on best practices—without being a people manager.
You will work closely with data engineers, application teams, and business stakeholders to deliver a secure, reliable, and high‑performing data platform that supports analytics and business decision‑making.
Lead the technical design and evolution of an enterprise Azure Databricks data platform
Design and implement data ingestion, transformation, and orchestration pipelines
Build and maintain Lakehouse architecture (Bronze / Silver / Gold layers; illustrated in the sketch after this list)
Implement data governance, access control, and metadata management using Unity Catalog
Collaborate with source system teams to deliver secure and reliable data integrations
Optimise platform performance, scalability, and cloud cost efficiency
Define and enforce data engineering standards, CI/CD practices, and operational guidelines
Troubleshoot complex platform issues and drive continuous improvements
Provide technical guidance and mentoring to data engineers and analysts
Translate business requirements into scalable technical data solutions
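To make the Lakehouse and governance responsibilities above more concrete, here is a minimal PySpark sketch of a Bronze / Silver / Gold flow ending with a Unity Catalog grant. It is illustrative only: the source path, table names, and the data_analysts group are assumptions, not details of this role.

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.getOrCreate()

    # Bronze: land raw source data as-is, tagged with load metadata
    bronze = (spark.read.format("json").load("/mnt/landing/orders/")
              .withColumn("_ingested_at", F.current_timestamp()))
    bronze.write.format("delta").mode("append").saveAsTable("bronze.orders")

    # Silver: cleanse and conform the raw records
    silver = (spark.table("bronze.orders")
              .dropDuplicates(["order_id"])
              .filter(F.col("order_id").isNotNull()))
    silver.write.format("delta").mode("overwrite").saveAsTable("silver.orders")

    # Gold: business-level aggregate ready for analytics
    gold = (spark.table("silver.orders")
            .groupBy("customer_id")
            .agg(F.sum("amount").alias("total_amount")))
    gold.write.format("delta").mode("overwrite").saveAsTable("gold.customer_spend")

    # Unity Catalog: grant read access on the curated layer to an analyst group (assumed group name)
    spark.sql("GRANT SELECT ON TABLE gold.customer_spend TO `data_analysts`")

In practice each layer would typically sit in its own governed schema, with the pipeline scheduled and orchestrated through Azure Data Factory or Databricks workflows.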
Minimum 5 years of experience in data engineering or data platform development
Strong hands‑on experience with the Azure and Databricks data stack
Proven experience with Azure Data Factory
Good understanding of distributed data processing and Lakehouse design
Experience building scalable ETL / ELT pipelines
Familiarity with CI/CD pipelines (Azure DevOps or GitHub Actions); a minimal test sketch follows this list
Strong communication and stakeholder engagement skills
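As a rough illustration of the CI/CD expectation above, the sketch below shows the kind of unit test an Azure DevOps or GitHub Actions pipeline might run against a transformation before deployment. The clean_orders function and its sample data are hypothetical, not part of this posting.

    from pyspark.sql import SparkSession, functions as F

    def clean_orders(df):
        # Hypothetical Silver-layer transformation: drop duplicate orders and null keys
        return df.dropDuplicates(["order_id"]).filter(F.col("order_id").isNotNull())

    def test_clean_orders_removes_duplicates_and_nulls():
        spark = SparkSession.builder.master("local[1]").getOrCreate()
        df = spark.createDataFrame(
            [(1, 10.0), (1, 10.0), (None, 5.0)],
            "order_id long, amount double",
        )
        assert clean_orders(df).count() == 1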
Working Hours: Monday to Friday, 9:00 AM – 6:00 PM. Work Arrangement: Hybrid (2 days onsite, 3 days remote). Work Location: West Singapore, easily accessible by public transport.
#DataEngineering #DataPlatform #AzureDatabricks #AzureDataFactory #UnityCatalog #PySpark #DeltaLake