Senior Data Analyst (Data Engineering)

gradmalaysia.com

Kuala Lumpur

On-site

MYR 120,000 - 160,000

Full time

26 days ago

Job summary

A leading tech recruiting platform in Kuala Lumpur seeks a skilled Data Architect responsible for designing and optimizing enterprise data architecture, including data lakes and warehouses. The ideal candidate will have at least 7 years of experience in data engineering, including 3+ years in a leadership role, and expertise in modern data technologies. Candidates should be proficient in SQL and Python and familiar with cloud platforms. The role supports data-driven business intelligence initiatives and involves collaboration across multiple teams.

Qualifications

  • At least 7 years of experience in data engineering and architecture.
  • 3+ years in a leadership or architectural role.
  • Strong hands-on experience with data warehouse platforms.

Responsibilities

  • Manage the end-to-end design and development of enterprise data architecture.
  • Drive adoption of modern data architectures ensuring efficient data retrieval.
  • Design and optimize ETL/ELT data pipelines for analytics.

Skills

Modern data technologies
Cloud platforms
SQL
Python
Data modeling tools
Real-time data processing
Data governance

Education

Bachelor’s Degree in Computer Science, Data Science, Data Engineering

Tools

IBM DataStage
Informatica
Talend
Airflow
Kafka
Spark

Job description

You will be responsible for the design, development, and optimisation of the enterprise data architecture, including data warehouses, pipelines, models, and cloud-based platforms. This role is expected to ensure a high-quality, scalable, and secure data platform that supports business intelligence (BI), advanced analytics, and enterprise-wide data integration. You will provide recommendations on technology direction, advocate for modern data architecture principles, and foster a culture of data engineering excellence and innovation.

You will work closely with data analysts, BI developers, data scientists, and IT teams to drive best practices in data engineering and contribute to the organisation’s broader journey toward becoming a fully insight-driven and AI-enabled enterprise.

JOB SCOPE
  • Manage the end-to-end design and development of enterprise data architecture, including data lakes, data lakehouses, data warehouses, and integration layers in both on-premises and cloud environments.
  • Define and maintain data models, schemas, storage frameworks, and access strategies to support BI, reporting, and analytics use cases.
  • Drive the adoption of modern data architectures (e.g., lakehouse, data mesh, streaming-first), ensuring efficient data storage and retrieval performance.
  • Establish guiding design principles, architectural patterns, and best practices to ensure scalability, reusability, security, and performance.
  • Design, build, and optimize scalable ETL/ELT data pipelines to support analytics and business intelligence use cases across the organisation (a minimal pipeline sketch follows this list).
  • Develop and manage data ingestion, transformation, and processing workflows for both structured and unstructured data.
  • Ensure real-time and batch data processing capabilities are optimised for business responsiveness.
  • Implement robust monitoring, alerting, and observability for pipelines to ensure high system reliability and performance.
  • Manage the development of API-based data integrations between internal systems and third-party platforms.
  • Integrate enterprise applications such as ERP, CRM, IoT platforms, and external data sources into the central data platform.
  • Enable seamless data interoperability across business units, supporting cross-functional data consumption and analytics.
  • Partner closely with data analysts, BI teams, data scientists, and IT infrastructure teams to ensure alignment between data architecture and business requirements.
  • Collaborate with IT and cybersecurity functions to ensure adherence to data governance, privacy, and compliance standards.
  • Translate business data needs into scalable technical solutions.
  • Optimize storage, indexing, and retrieval to support efficient, low-latency query performance.
  • Troubleshoot data quality and latency issues, applying best practices in resilience and fault tolerance.
  • Implement automated data quality checks, validation rules, and exception handling mechanisms.
  • Stay abreast of emerging trends in cloud computing, big data, and modern data engineering, guiding the adoption of new tools and practices.
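
As a rough illustration of the kind of ETL/ELT pipeline and automated quality gate described in the list above, the following is a minimal sketch assuming a recent Airflow 2.x release with the TaskFlow API; the DAG name, task logic, and sample records are hypothetical placeholders, not part of the posting.

    # Illustrative sketch only: a tiny Airflow ELT DAG with an automated data
    # quality gate. Names and sample data are hypothetical.
    from datetime import datetime

    from airflow.decorators import dag, task


    @dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
    def enterprise_elt_sketch():

        @task
        def extract():
            # A real task would pull from an ERP/CRM source via a connection/hook.
            return [{"order_id": 1, "amount": 120.0}, {"order_id": 2, "amount": 80.0}]

        @task
        def transform(rows):
            # Placeholder business transformation applied to each record.
            return [{**r, "amount_valid": r["amount"] >= 0} for r in rows]

        @task
        def quality_check(rows):
            # Automated validation rule: raising here fails the DAG run, which
            # surfaces through Airflow's monitoring and alerting.
            if not all(r["amount_valid"] for r in rows):
                raise ValueError("Data quality check failed: negative amounts found")
            return rows

        @task
        def load(rows):
            # Would write to the warehouse or lakehouse table; printed for brevity.
            print(f"Loading {len(rows)} validated rows")

        load(quality_check(transform(extract())))


    enterprise_elt_sketch()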

REQUIRED COMPETENCIES
  • Demonstrates the capability to apply modern data technologies, architectures, and platforms to solve business challenges.
  • Able to evaluate, adopt, and integrate appropriate tools (e.g., cloud platforms, orchestration engines, data lakes) to ensure scalability, performance, and innovation in data infrastructure.
  • Capable of aligning data architecture and platform design with long-term enterprise goals.
  • Anticipates future trends in data and analytics, identifies risks and opportunities, and translates them into actionable strategies and scalable designs.
  • Able to interpret and simplify complex, multi-source data ecosystems into modular, maintainable components.
  • Demonstrates sound judgement in navigating integration, transformation, and optimisation across various technologies and business domains.
  • Effectively designs and improves engineering processes, tools, and standards to deliver high-quality data products at scale.
  • Applies automation, reusability, and DevOps principles to optimise performance, reduce redundancy, and enhance delivery speed.
  • Demonstrates strong cross-functional collaboration skills by engaging business, IT, and analytics teams to align on shared goals.
  • Translates business data requirements into scalable solutions while building trust and fostering partnership across teams.
  • Actively explores new technologies, patterns, and methods to improve data platform capabilities. Encourages experimentation and ideation, embedding innovation into architecture decisions and delivery practices.

QUALIFICATIONS
  • Bachelor’s Degree in Computer Science, Data Science, Data Engineering, Information Systems, or equivalent qualification from accredited higher learning institutions.
  • At least seven (7) years of experience in data engineering and architecture, including 3+ years in a leadership or architectural role.
  • Strong hands-on experience with data warehouse platforms (IBM NPS), ETL/ELT frameworks (e.g. IBM DataStage, Informatica, Talend), orchestration tools (e.g. Airflow), and data integration platforms.
  • Demonstrated expertise in cloud platforms (AWS, Azure, or GCP) and modern data warehouse technologies (e.g. Snowflake, BigQuery, Redshift, Synapse).
  • Proficiency in SQL, Python, and data modeling tools; experience with real-time data processing (e.g. Kafka, Spark) is a strong advantage (see the sketch after this list).
  • Familiarity with data governance frameworks, metadata management, and data security policies.
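
For reference, here is a minimal sketch of the kind of real-time processing mentioned above, assuming PySpark Structured Streaming with the Kafka source connector on the classpath; the broker address, topic name, and console sink are hypothetical placeholders rather than anything specified in the posting.

    # Illustrative sketch only: read a stream of events from Kafka and write
    # micro-batches to the console. Broker and topic names are hypothetical.
    from pyspark.sql import SparkSession
    from pyspark.sql.functions import col

    spark = (
        SparkSession.builder
        .appName("realtime-ingestion-sketch")
        .getOrCreate()
    )

    # Requires the spark-sql-kafka connector, e.g. via
    # --packages org.apache.spark:spark-sql-kafka-0-10_2.12:<spark version>
    events = (
        spark.readStream
        .format("kafka")
        .option("kafka.bootstrap.servers", "localhost:9092")  # hypothetical broker
        .option("subscribe", "orders")                         # hypothetical topic
        .load()
    )

    # Kafka keys/values arrive as bytes; cast to strings before further parsing.
    decoded = events.select(col("key").cast("string"), col("value").cast("string"))

    # A production pipeline would write to a warehouse or lakehouse table;
    # the console sink keeps this sketch self-contained.
    query = (
        decoded.writeStream
        .format("console")
        .outputMode("append")
        .start()
    )

    query.awaitTermination()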