SSIS / Data Management / Microsoft Fabric Developer

Innocel Services Sdn Bhd

Kuala Lumpur

On-site

MYR 150,000 - 200,000

Full time

Job summary

A technology services company in Kuala Lumpur is seeking motivated SSIS / Data Management Developers at various experience levels. The role focuses on data integration and ETL pipelines, using tools such as SQL Server and Microsoft Fabric to support enterprise reporting and analytics. Successful candidates will design, implement, and maintain data processes while ensuring data quality and security standards are met. Join a dynamic data engineering team and contribute to its data initiatives.

Qualifications

  • Strong understanding of SQL and relational databases.
  • Familiarity with SSIS and ETL methodologies.
  • Good analytical and problem-solving skills.

Responsibilities

  • Assist in developing and maintaining ETL pipelines using SSIS and SQL Server.
  • Collaborate with team members to troubleshoot ETL issues.
  • Lead the design of enterprise-grade ETL/data integration solutions.

Skills

SQL
Data integration
ETL pipelines
Microsoft Fabric
Data validation

Education

Bachelor’s Degree in Information Technology / Computer Science / Software Engineering

Tools

SSIS
SQL Server
Azure Data Factory
Power BI

Job description

SSIS / Data Management / Microsoft Fabric Developer

1. Overview

We are looking for passionate and driven SSIS / Data Management Developers at several experience levels (Fresh/Junior, Mid-Level, Senior) to join our data engineering team.
The ideal candidate will work with data integration, ETL pipelines, Microsoft SQL Server, SSIS, Azure Data Services, and Microsoft Fabric to support enterprise reporting, analytics, and data warehousing initiatives.

2. Key Responsibilities (by Level)

Fresh Graduate / Junior Developer

  • Assist in developing and maintaining ETL pipelines using SSIS and SQL Server.
  • Support data ingestion into data warehouses / lakehouses under supervision.
  • Work with senior team members to troubleshoot ETL issues and perform data validation.
  • Write basic SQL queries and stored procedures, and handle debugging tasks.
  • Learn and support Microsoft Fabric components (Data Pipelines, Dataflows Gen2, Lakehouse, Warehouse, Notebooks).
  • Document technical processes and follow best practices for data quality and security.
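
Day-to-day data validation at this level might look like the following minimal sketch, which uses Python's built-in sqlite3 in place of SQL Server; the staging table, column names, and quality rules are illustrative assumptions, not part of this role's actual stack:

```python
import sqlite3

# Illustrative only: a toy staging table standing in for a SQL Server source.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE staging_orders (order_id INTEGER, amount REAL, customer_id INTEGER)"
)
conn.executemany(
    "INSERT INTO staging_orders VALUES (?, ?, ?)",
    [(1, 120.50, 10), (2, None, 11), (3, -5.00, None)],
)

# Typical post-load validation: count rows that violate basic quality rules.
checks = {
    "null_amount":      "SELECT COUNT(*) FROM staging_orders WHERE amount IS NULL",
    "negative_amount":  "SELECT COUNT(*) FROM staging_orders WHERE amount < 0",
    "missing_customer": "SELECT COUNT(*) FROM staging_orders WHERE customer_id IS NULL",
}
failures = {name: conn.execute(sql).fetchone()[0] for name, sql in checks.items()}
print(failures)  # any non-zero count flags rows needing investigation
```

In a real SSIS workflow the same checks would typically run as SQL tasks against a staging database, with the counts logged for review.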

Mid-Level Developer (1–4 years’ experience)

  • Develop, enhance, and optimize SSIS packages, SQL-based ETL jobs, and scheduling workflows.
  • Build modular and reusable data pipelines for data integration and transformation.
  • Write complex SQL queries, carry out performance tuning, and run data quality checks.
  • Implement DataOps practices such as logging, exception handling, and automation.
  • Participate in Microsoft Fabric projects, including Lakehouse ingestion, Medallion architecture (Bronze/Silver/Gold), and Dataflows Gen2.
  • Collaborate with BI, data governance, and application teams to support analytics delivery.
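
The DataOps practices mentioned above (logging, exception handling) can be sketched as a small Python wrapper around ETL steps; the step names and functions below are hypothetical, not tied to any specific framework:

```python
import logging

logging.basicConfig(level=logging.INFO, format="%(levelname)s %(message)s")
log = logging.getLogger("etl")

def run_step(name, func, *args):
    """Run one ETL step with logging and exception handling, mirroring
    the kind of control flow an SSIS package records in a run log."""
    log.info("starting step: %s", name)
    try:
        result = func(*args)
        log.info("finished step: %s", name)
        return result
    except Exception:
        log.exception("step failed: %s", name)
        raise  # surface the failure so a scheduler can retry or alert

# Hypothetical steps for illustration.
def extract():
    return [{"id": 1, "qty": "3"}, {"id": 2, "qty": "7"}]

def transform(rows):
    return [{**r, "qty": int(r["qty"])} for r in rows]

rows = run_step("extract", extract)
rows = run_step("transform", transform, rows)
print(rows)  # [{'id': 1, 'qty': 3}, {'id': 2, 'qty': 7}]
```

The same pattern scales to real pipelines: each step logs its start, finish, and any exception, so failures are traceable without digging through the job output.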

Senior Developer (5+ years’ experience)

  • Lead the design and implementation of enterprise-grade ETL/data integration solutions.
  • Architect and implement end-to-end pipelines using SSIS, SQL Server, Azure Data Factory, and Microsoft Fabric.
  • Optimize large-scale data models and ensure data reliability, performance, and scalability.
  • Drive Medallion architecture in Fabric (Bronze/Silver/Gold/Platinum layers).
  • Mentor junior developers and enforce coding standards, DevOps practices, and documentation.
  • Conduct root-cause analysis of data quality issues and refine ETL frameworks.
  • Work closely with data architects and business stakeholders to deliver strategic data initiatives.
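
The Medallion layering referred to above (Bronze raw → Silver cleansed → Gold aggregated) can be illustrated in miniature; this is a sketch with made-up sales records, not a Fabric API example:

```python
# Bronze: raw ingested records, kept as-is (including bad rows).
bronze = [
    {"store": "KL", "sales": "100"},
    {"store": "KL", "sales": "250"},
    {"store": "PJ", "sales": None},  # bad record retained in Bronze
    {"store": "PJ", "sales": "80"},
]

# Silver: cleansed and typed — drop rows failing validation, cast values.
silver = [
    {"store": r["store"], "sales": int(r["sales"])}
    for r in bronze
    if r["sales"] is not None
]

# Gold: business-level aggregate ready for reporting (e.g. Power BI).
gold = {}
for r in silver:
    gold[r["store"]] = gold.get(r["store"], 0) + r["sales"]

print(gold)  # {'KL': 350, 'PJ': 80}
```

In Fabric the same flow would run across Lakehouse tables with Data Pipelines or Dataflows Gen2 moving data between layers, but the layering principle is the same.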

3. Required Skills & Qualifications (All Levels)

  • Bachelor’s Degree in Information Technology / Computer Science / Software Engineering, or equivalent.
  • Strong understanding of SQL, relational databases, and data transformation logic.
  • Familiarity with SSIS, SQL Server Agent, and ETL methodologies.
  • Knowledge of data modelling, data cleansing, and data validation concepts.
  • Good analytical, problem-solving, and communication skills.
  • Ability to work in a fast-paced, collaborative environment.

4. Additional Requirements (By Level)

Fresh / Junior

  • Good fundamentals in SQL and database design.
  • Internship experience in data, analytics, or software development is a plus.
  • Willingness to learn new tools such as Microsoft Fabric, Azure Data Factory, and Power BI.

Mid-Level

  • Hands-on experience with SSIS, SQL Server, stored procedures, and ETL troubleshooting.
  • Experience with Azure Data Lake, ADF, or Fabric Data Pipelines is an advantage.

Senior

  • 4+ years in data engineering, ETL, or DataOps roles.
  • Strong experience with large-scale datasets, data warehouse (DWH) projects, and performance tuning.
  • Experience with Microsoft Fabric, Azure Synapse, or advanced cloud architectures.
  • Ability to lead, mentor, and manage complex data workflows.
  • Knowledge of automation, DevOps, and CI/CD for data pipelines (Git, YAML pipelines, CI/CD in Fabric).
