
Senior Data Engineer

Adecco

Kuala Lumpur

On-site

MYR 70,000 - 90,000

Full time

Yesterday


Job summary

A leading recruitment agency is seeking a Data Engineer to provision and configure Microsoft Fabric capacities and manage data integrations. Responsibilities include establishing connectivity to external data sources, designing ingestion pipelines, and supporting production deployments. The ideal candidate must have strong proficiency in Azure, Databricks, and Python, along with experience in managing data quality and transformation processes. This role is essential for ensuring the reliability and scalability of data operations.

Qualifications

  • Proven experience as a Data Engineer working with Microsoft Fabric and Azure Data Services.
  • Strong knowledge of data ingestion and transformation frameworks.
  • Hands-on experience supporting UAT and production deployments.

Responsibilities

  • Establish and ensure secure connectivity between data platforms and external sources.
  • Design ingestion pipelines for structured and unstructured data.
  • Produce clear documentation of data architecture and operations.

Skills

Azure
Databricks
Python
SQL
Data Modelling
Data Ingestion
Data Transformation
Data Orchestration

Tools

Microsoft Fabric
Azure Data Services

Job description

Provision and configure Microsoft Fabric capacities, workspaces, semantic models, and supporting Azure services (e.g., Storage, Key Vault).

Deploy the Agile Insights Fabric Admin and Governance Pack and the ETL framework to support robust data platform operations.

Data Integration & Manipulation

Establish connectivity between the data platform and external data sources, ensuring secure and reliable access.

Design and implement ingestion pipelines for structured/unstructured data.

Develop data transformations across layers (bronze → silver → gold) to ensure scalable, high-quality, and business-ready datasets.
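For candidates unfamiliar with the layered (medallion) pattern referenced above, a minimal sketch in plain Python is shown below. In practice this work would run on Databricks or Fabric pipelines; the record fields, cleaning rules, and function names here are illustrative assumptions, not part of the role's actual stack:

```python
# Minimal illustration of medallion-style layering (bronze -> silver -> gold).
# Hypothetical example: field names and cleaning rules are invented for clarity.

bronze = [  # bronze: raw records exactly as ingested, duplicates and bad data included
    {"id": "1", "amount": "100.5", "region": "MY "},
    {"id": "2", "amount": "bad", "region": "SG"},     # malformed amount
    {"id": "1", "amount": "100.5", "region": "MY "},  # duplicate of the first row
]

def to_silver(rows):
    """Silver layer: cast types, trim strings, drop malformed rows, deduplicate by id."""
    seen, out = set(), []
    for r in rows:
        try:
            rec = {"id": int(r["id"]),
                   "amount": float(r["amount"]),
                   "region": r["region"].strip()}
        except ValueError:
            continue  # a real pipeline would quarantine these rows for review
        if rec["id"] not in seen:
            seen.add(rec["id"])
            out.append(rec)
    return out

def to_gold(rows):
    """Gold layer: aggregate into a business-ready dataset (total amount per region)."""
    totals = {}
    for r in rows:
        totals[r["region"]] = totals.get(r["region"], 0.0) + r["amount"]
    return totals

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'MY': 100.5}
```

The point of the layering is that each stage has one responsibility: bronze preserves the raw feed, silver enforces quality, and gold serves consumers.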

Manage reference data integration and ensure alignment across domains.

Validation & Governance

Support UAT cycles, identifying and addressing data quality and integration issues.

Facilitate Change Advisory Board (CAB) endorsement, ensuring architecture and deployment readiness.

Production Deployment & Knowledge Transfer

Deploy solutions into production with proactive monitoring and operational continuity.

Produce clear as-built documentation of data architecture, pipelines, and processes.

Conduct knowledge transfer sessions with client teams to ensure sustainable operations.

Requirements

Must have: Azure, Databricks & Python

Proven experience as a Data Engineer working with Microsoft Fabric, Azure Data Services, and modern data platforms.

Strong knowledge of data ingestion, transformation, and orchestration frameworks.

Experience building data pipelines across bronze, silver, and gold layers.

Proficiency in SQL, data modelling, and handling structured/unstructured data.

Familiarity with security, governance, and deployment best practices in enterprise data environments.

Hands-on experience supporting UAT, CAB processes, and production deployments.
