
A data engineering company in Kuala Lumpur seeks passionate and driven SSIS/Data Management Developers at all experience levels (Fresh, Junior, Senior). The role involves developing ETL pipelines, data integration, and working with Microsoft SQL Server and Azure Data Services. Ideal candidates should hold a degree in Information Technology or related fields and possess strong SQL skills. Opportunities to learn and work with Microsoft Fabric are available, fostering professional growth in a collaborative environment.
We are looking for passionate and driven SSIS / Data Management Developers at all experience levels (Fresh, Junior, Senior) to join our data engineering team.
The ideal candidate will work on data integration and ETL pipelines using Microsoft SQL Server, SSIS, Azure Data Services, and Microsoft Fabric to support enterprise reporting, analytics, and data warehousing initiatives.
Responsibilities (Fresh / Entry Level):
Assist in developing and maintaining ETL pipelines using SSIS and SQL Server.
Support data ingestion into data warehouses / lakehouses under supervision.
Work with senior team members to troubleshoot ETL issues and perform data validation (see the validation sketch after this list).
Perform basic SQL queries, stored procedures, and debugging tasks.
Learn and support Microsoft Fabric components (Data Pipelines, Dataflows Gen2, Lakehouse, Warehouse, Notebooks).
Document technical processes and follow best practices for data quality and security.
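For context on the validation duties above, here is a minimal sketch of the kind of post-load check an entry-level developer might support, using Python and pyodbc; the connection string, the StagingOrders table, and the OrderID column are hypothetical placeholders, not part of this posting.

```python
# Minimal post-load validation sketch; all names below are assumed placeholders.
import pyodbc

# Hypothetical connection string -- point this at your own SQL Server instance.
CONN_STR = (
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=localhost;DATABASE=StagingDB;Trusted_Connection=yes;"
)

def validate_staging_load(table: str = "dbo.StagingOrders") -> None:
    """Run two simple post-load checks: non-empty table, no NULL business keys."""
    with pyodbc.connect(CONN_STR) as conn:
        cursor = conn.cursor()

        # Check 1: the load actually inserted rows.
        row_count = cursor.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
        if row_count == 0:
            raise ValueError(f"{table} is empty -- the ETL load may have failed")

        # Check 2: the business key column (assumed to be OrderID) has no NULLs.
        null_keys = cursor.execute(
            f"SELECT COUNT(*) FROM {table} WHERE OrderID IS NULL"
        ).fetchone()[0]
        if null_keys:
            raise ValueError(f"{null_keys} rows in {table} have a NULL OrderID")

        print(f"{table}: {row_count} rows loaded, keys OK")

if __name__ == "__main__":
    validate_staging_load()
```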
Responsibilities (Junior / Mid-Level):
Develop, enhance, and optimize SSIS packages, SQL-based ETL jobs, and scheduling workflows.
Build modular and reusable data pipelines for data integration and transformation.
Perform complex SQL queries, performance tuning, and data quality checks.
Implement DataOps practices such as logging, exception handling, and automation (see the sketch after this list).
Participate in Microsoft Fabric projects including Lakehouse ingestion, Medallion architecture (Bronze/Silver/Gold), and Dataflows Gen2.
Collaborate with BI, data governance, and application teams to support analytics delivery.
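As referenced above, a minimal sketch of the DataOps logging and exception-handling pattern, in Python; the step function and its arguments are hypothetical placeholders.

```python
# Minimal DataOps sketch: structured logging and fail-fast error handling
# around a single ETL step. The step function itself is a hypothetical placeholder.
import logging
import sys
from datetime import datetime, timezone

logging.basicConfig(
    level=logging.INFO,
    format="%(asctime)s %(levelname)s %(name)s %(message)s",
)
log = logging.getLogger("etl.orders")

def run_step(name, step_fn, *args, **kwargs):
    """Run one pipeline step with timing, logging, and a non-zero exit on failure."""
    started = datetime.now(timezone.utc)
    log.info("step=%s status=started", name)
    try:
        result = step_fn(*args, **kwargs)
    except Exception:
        # Log the full traceback, then exit non-zero so the scheduler
        # (e.g. SQL Server Agent) marks the job as failed.
        log.exception("step=%s status=failed", name)
        sys.exit(1)
    elapsed = (datetime.now(timezone.utc) - started).total_seconds()
    log.info("step=%s status=succeeded elapsed_s=%.1f", name, elapsed)
    return result

# Usage (hypothetical step): run_step("load_orders", load_orders, "/data/orders.csv")
```

Keeping the wrapper generic means every step emits the same started/succeeded/failed log lines, which makes downstream alerting and run-history queries uniform.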
Responsibilities (Senior):
Lead the design and implementation of enterprise‑grade ETL/data integration solutions.
Architect and implement end‑to‑end pipelines using SSIS, SQL Server, Azure Data Factory, and Microsoft Fabric.
Optimize large‑scale data models and ensure data reliability, performance, and scalability.
Drive Medallion architecture in Fabric (Bronze/Silver/Gold/Platinum layers); a Bronze-to-Silver sketch follows this list.
Mentor junior developers and enforce coding standards, DevOps practices, and documentation.
Conduct root‑cause analysis on data quality issues and refine ETL frameworks.
Work closely with data architects and business stakeholders to deliver strategic data initiatives.
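As referenced above, a minimal Bronze-to-Silver Medallion sketch in PySpark, the kind of code that might run in a Fabric notebook; the lakehouse table and column names are hypothetical placeholders.

```python
# Minimal Medallion sketch (Bronze -> Silver) in PySpark.
# Table and column names below are assumed placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("bronze-to-silver").getOrCreate()

# Bronze: raw ingested data, stored as-is.
bronze = spark.read.table("bronze_orders")

# Silver: cleaned and conformed -- dedupe, enforce types, drop bad records.
silver = (
    bronze
    .dropDuplicates(["order_id"])
    .withColumn("order_date", F.to_date("order_date"))
    .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
    .filter(F.col("order_id").isNotNull())
)

# Overwrite the Silver table; Gold aggregates would build on top of this layer.
silver.write.mode("overwrite").saveAsTable("silver_orders")
```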
Requirements (All Levels):
Bachelor’s Degree in Information Technology / Computer Science / Software Engineering or equivalent.
Strong understanding of SQL, relational databases, and data transformation logic.
Familiarity with SSIS, SQL Server Agent, and ETL methodologies.
Knowledge in data modelling, data cleansing, and data validation concepts.
Good analytical, problem‑solving, and communication skills.
Ability to work in a fast‑paced and collaborative environment.
Requirements (Fresh / Entry Level):
Good fundamentals in SQL and database design.
Internship experience in data, analytics, or software development is a plus.
Willingness to learn new tools such as Microsoft Fabric, Azure Data Factory, and Power BI.
Requirements (Junior / Mid-Level):
Hands‑on experience with SSIS, SQL Server, stored procedures, and ETL troubleshooting.
Experience with Azure Data Lake, ADF, or Fabric Data Pipelines is an advantage.
Requirements (Senior):
4+ years in data engineering, ETL, or DataOps roles.
Strong experience with large‑scale datasets, DWH projects, and performance tuning.
Experience in Microsoft Fabric, Azure Synapse, or advanced cloud architectures.
Ability to lead, mentor, and manage complex data workflows.
Knowledge in automation, DevOps, and CI/CD for data pipelines (Git, YAML pipelines, CI/CD in Fabric); a CI-style smoke-test sketch follows.
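As referenced above, a minimal sketch of the kind of smoke test a Git-based CI pipeline could run before deploying ETL changes, using pytest and pyodbc; the environment variable, table, and column names are hypothetical placeholders.

```python
# Minimal CI smoke-test sketch; all warehouse names below are assumed placeholders.
import os

import pyodbc
import pytest

# Connection string supplied by the CI environment (hypothetical variable name).
CONN_STR = os.environ.get("DW_CONN_STR", "")

@pytest.mark.skipif(not CONN_STR, reason="DW_CONN_STR not set in this environment")
def test_fact_orders_is_fresh_and_keyed():
    """Fail the build if the warehouse table is stale or violates its key assumption."""
    with pyodbc.connect(CONN_STR) as conn:
        cursor = conn.cursor()

        # Freshness: at least one row loaded in the last 24 hours (LoadDate is assumed).
        recent = cursor.execute(
            "SELECT COUNT(*) FROM dbo.FactOrders "
            "WHERE LoadDate >= DATEADD(day, -1, GETDATE())"
        ).fetchone()[0]
        assert recent > 0, "FactOrders has no rows loaded in the last 24 hours"

        # Uniqueness: OrderKey should not contain duplicates.
        dupes = cursor.execute(
            "SELECT COUNT(*) FROM (SELECT OrderKey FROM dbo.FactOrders "
            "GROUP BY OrderKey HAVING COUNT(*) > 1) d"
        ).fetchone()[0]
        assert dupes == 0, f"{dupes} duplicate OrderKey values in FactOrders"
```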