Data & AI Ops Engineer

Jobstreet Malaysia

Kuala Lumpur

On-site

MYR 100,000 - 150,000

Full time

Posted today

Job summary

A financial services provider in Kuala Lumpur is looking for a Data & AI Ops Engineer to design and maintain scalable data pipelines. The role requires expertise in data architecture, API integration, and cloud platforms. Ideal candidates hold a Bachelor's degree in a related field, or a diploma with equivalent experience, and have a strong foundation in data engineering and operational support. Excellent problem-solving and communication skills are essential. A competitive salary and opportunities for professional growth are offered.

Qualifications

  • Minimum 6 years of hands-on experience in data engineering or data integration.
  • At least 3 years in designing and supporting enterprise-scale data pipelines.
  • Alternatively, a Diploma in a relevant discipline with a minimum of 8 years of practical experience.

Responsibilities

  • Design and maintain scalable data pipelines for analytics and reporting.
  • Ensure data quality and reliability through validation and transformation.
  • Collaborate with teams to define data requirements and solutions.
  • Optimize data storage and performance across cloud platforms.
  • Monitor and resolve data-related issues in accordance with SLAs.

Skills

Data architecture understanding
ETL/ELT pipeline design
Experience with RDBMS
Familiarity with API integrations
Knowledge of cloud platforms
Understanding of data governance
Awareness of AI/ML fundamentals
Experience in BI frameworks
Strong documentation skills
Problem-solving ability

Education

Bachelor’s Degree in Computer Science or related field
Diploma in a relevant discipline with significant experience

Tools

Azure Data Factory
Databricks
Power BI

Job description

Industry: Financial Services/Takaful (Islamic Insurance)

The Data & AI Ops Engineer provides end-to-end technical expertise in designing, building, and operating scalable data and AI-enabled platforms. This role ensures reliable, high-quality data pipelines, models, and integrations that support analytics, reporting, and AI-driven solutions across the organization.

The position plays a key role in data architecture, API-based integration, performance optimization, and operational support, ensuring data availability across dashboards, applications, and backend systems while meeting SLA requirements for Incident Requests (IR) and Service Requests (SR).

Key Responsibilities
  • Design, develop, and maintain scalable data pipelines and integration workflows for analytics, reporting, and operational systems
  • Ensure data quality, consistency, and reliability through validation, cleansing, and transformation processes
  • Collaborate with developers, business analysts, and system owners to define data requirements and deliver business-aligned solutions
  • Optimize data storage and retrieval performance across databases, data lakes, and cloud platforms for batch and real-time processing
  • Build, deploy, and enhance data models, APIs, and ETL/ELT frameworks in line with architecture and governance standards
  • Monitor and resolve data-related IRs and SRs in accordance with SLA and operational expectations
  • Support data-centric IT initiatives, including planning, coordination, risk mitigation, and stakeholder engagement
  • Maintain clear documentation for data flows, schemas, and integration logic to support audit, compliance, and knowledge sharing
  • Ensure seamless integration with core business systems and external platforms
  • Continuously evaluate and adopt emerging data and AI technologies to improve efficiency and solution performance

Key Skills & Requirements
  • Strong understanding of data architecture, data modeling, and ETL/ELT pipeline design
  • Experience with enterprise data platforms (RDBMS required; Azure Data Factory, Databricks, Power BI are a plus)
  • Familiarity with API-based integration (REST, SOAP, MQ)
  • Knowledge of cloud platforms (Azure, AWS, or GCP)
  • Understanding of data governance, security, privacy, lineage, and compliance requirements
  • Awareness of AI/ML fundamentals, including chatbot solutions, predictive analytics, and model deployment
  • Experience in BI and reporting frameworks for visualization and decision support
  • Strong documentation, communication, and stakeholder engagement skills
  • Ability to operate in dynamic environments, troubleshoot issues, and maintain operational stability

Education & Experience
  • Option 1: Bachelor's Degree in Computer Science, Information Systems, Data Analytics, or a related field, plus:
    • Minimum 6 years of hands-on experience in data engineering or data integration
    • At least 3 years designing and supporting enterprise-scale data pipelines
  • Option 2: Diploma in a relevant discipline, plus:
    • Minimum 8 years of practical experience in data engineering or related domains
    • At least 4 years in a senior or technical lead role supporting cross-functional data initiatives