
Manager, Data Engineering

CTOS

Kuala Lumpur

On-site

MYR 120,000 - 150,000

Full time

Yesterday

Job summary

A leading credit reporting agency in Kuala Lumpur is seeking a Data Management Leader to oversee data ingestion, management, and operations. The ideal candidate will have extensive experience with data platforms and cloud services. You will design and implement data pipelines, ensuring operational excellence and compliance. The role requires a strong background in analytics and proven leadership skills to drive the department’s vision. This position offers opportunities for significant impact and innovation in the financial industry.

Qualifications

  • Minimum 10 years’ experience in product/software development.
  • Minimum 5 years’ experience working with data platforms.
  • Strong knowledge of data platforms and cloud-native services.

Responsibilities

  • Oversee activities related to data ingestion and management.
  • Execute data extraction and analysis for client needs.
  • Design and develop data pipelines from multiple sources.

Skills

Data platforms
Data pipelines
Cloud services (AWS preferred)
SQL
Python/Pandas/PySpark
ETL tools (Talend)
Shell scripting
Monitoring tools (Airflow)
Data reporting (Grafana, Power BI)
Automated data processes

Education

Bachelor's Degree in Computer Science, Information Technology or equivalent

Tools

Hadoop
Spark
Jupyter

Job description

We are Malaysia’s leading Credit Reporting Agency (CRA). We are aggressively expanding our business and looking for dynamic, driven, and motivated individuals to join our team. Our Direct-To-Consumer (D2C) segment is one of our fastest-growing product areas in the market, with an abundance of expansion plans and innovative ideas on hand.

ROLE OVERVIEW

This role serves as a key leadership position within the department, overseeing all activities related to data ingestion, management of datasets, transformation, extraction, and operations to support both internal and external client needs.

The role is also responsible for shaping the department’s vision and culture, while providing technical expertise and driving a strong focus on operational excellence.

KEY RESPONSIBILITIES
  • Execute ad-hoc, monthly, and quarterly data extraction services as required by CTOS internal/external clients and in accordance with service SLAs
  • Perform QA activities related to client data services
  • Execute data analysis to provide insight for internal/external needs
  • Automate recurrent data related activities
  • Investigate data to answer customer complaints related to CTOS services (for example, score complaints, credit report complaints, etc.)

Data pipeline development and operationalization
  • Design data pipelines to ingest data from multiple sources and implement quality processes
  • Design processes to create data insight, data features and operationalize analytics models
  • Develop and implement those data pipelines
  • Operationalize data pipelines for automated and semi-automated execution
  • Implement reporting of data universe, data quality across the organization
  • Study business requirements and gather business rules to standardize attributes in the warehouse.
  • Perform code reviews to ensure data lake implementation standards are followed.
  • Establish and maintain documentation in terms of business rules and requisites, data dictionary, pipeline definition, data flow, etc.
  • Own the Data Platform, suggest, specify, and implement enhancements to the platform
  • Suggest and implement technology components relevant to the data platform
  • Manage data lake infrastructure and services in terms of capacity, scalability, usage, performance, cost utilization, change requests, parameter configuration, etc.
  • Support key initiatives including migration of CTOS data services to the cloud
  • Maintain awareness of industry trends and regulatory compliance, implement proactive controls to mitigate risks.
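As an illustration of the pipeline-with-quality-checks work described above, here is a minimal sketch using pandas (one of the tools the posting lists). The sources, column names, and quality rule are hypothetical, invented only for this example; a real CTOS pipeline would read from actual connectors and apply agreed business rules.

```python
# Illustrative sketch only: ingest two hypothetical sources, apply a
# simple quality rule, and merge them on a shared key.
import io
import pandas as pd

def ingest(csv_text: str) -> pd.DataFrame:
    """Ingest one source from CSV text (stand-in for a real connector)."""
    return pd.read_csv(io.StringIO(csv_text))

def quality_check(df: pd.DataFrame, key: str) -> pd.DataFrame:
    """Drop rows with a missing or duplicated key and report the loss."""
    clean = df.dropna(subset=[key]).drop_duplicates(subset=[key])
    print(f"quality: kept {len(clean)} of {len(df)} rows")
    return clean

# Two made-up sources keyed on customer_id (names are hypothetical).
scores = ingest("customer_id,score\n1,700\n2,650\n2,650\n,720")
reports = ingest("customer_id,reports\n1,3\n2,1")

merged = quality_check(scores, "customer_id").merge(reports, on="customer_id")
print(merged.to_dict("records"))
```

The same shape scales to PySpark, where `dropna`/`dropDuplicates` and `join` play the corresponding roles.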
WHAT DOES IT TAKE TO BE SUCCESSFUL
Qualifications
  • A Bachelor's Degree, Post Graduate Diploma, or Professional Degree in Computer Science, Information Technology, or equivalent
Work Experience
  • Minimum 10 years’ experience in product/software development
  • Minimum 5 years’ experience working with data platforms
  • Experience working within the financial industry is a plus
  • Strong knowledge of data platforms, data pipelines, and data-related development including Hadoop, Spark, Jupyter, and cloud-native services
  • Proficient in cloud services and data-related cloud-native services (AWS preferred)
  • Proficient in SQL, Python/Pandas/PySpark for implementing data pipelines
  • Skilled in automation and monitoring tools such as shell scripts and scheduling tools like Airflow
  • Experience with ETL tools such as Talend and/or cloud-native ETL services
  • Able to implement data reporting solutions using data platforms or reporting software such as Grafana and Power BI
  • Excellent written and verbal communication skills in English
  • Able to multi-task and prioritize activities effectively
  • Understanding of financial industry data sources (e.g., CCRIS, SSM, Angkasa, Idaman) is a plus
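The SQL and Python proficiency asked for above might look like the following in practice: a recurring extraction written as a parameterized query, run here against an in-memory SQLite database. The table, columns, and data are invented for illustration, not taken from CTOS systems.

```python
# Illustrative only: a parameterized monthly extraction in SQL,
# executed from Python against made-up data.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE credit_reports (customer_id INTEGER, score INTEGER, month TEXT)"
)
conn.executemany(
    "INSERT INTO credit_reports VALUES (?, ?, ?)",
    [(1, 700, "2024-01"), (2, 650, "2024-01"), (1, 710, "2024-02")],
)

# Monthly extraction: average score and row count per requested month.
query = """
    SELECT month, AVG(score) AS avg_score, COUNT(*) AS n
    FROM credit_reports
    WHERE month = ?
    GROUP BY month
"""
rows = conn.execute(query, ("2024-01",)).fetchall()
print(rows)  # [('2024-01', 675.0, 2)]
```

Binding the month as a parameter keeps the query reusable for the ad-hoc, monthly, and quarterly extractions the role describes.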
Leadership & Interpersonal Attributes
  • Excellent organizational, operational, and leadership skills
  • Strong analytical and problem-solving capabilities
  • Strategic and business-oriented mindset
  • Outstanding communication and interpersonal abilities
  • Ability to communicate complex concepts clearly and understandably