Cloud Analytics Engineer | Kuala Lumpur, MY

Prudential plc

Kuala Lumpur

On-site

MYR 80,000 - 120,000

Full time

2 days ago


Job summary

A leading financial services company seeks a Cloud Analytics Engineer to join its Cloud Economics and Intelligence team. The role requires expertise in data engineering, analytics, and cloud technologies, with a focus on Azure Data Factory and machine learning applications. Candidates should be passionate about IT security automation and data analytics programming, and about enhancing operational efficiency through innovative data solutions. Join us in shaping a secure and efficient cloud future that supports diverse career ambitions.

Qualifications

  • 4+ years in data engineering and analytics, including at least 2 years in a cloud environment.
  • Proficiency in Azure Data Factory, Azure Databricks, and strong Python coding skills.
  • Experience with containerization and best practices development.

Responsibilities

  • Design and optimize data pipelines and analytics workflows using Azure Databricks.
  • Develop and support data pipelines for cloud cost optimization.
  • Collaborate with AI/ML engineers to integrate Gen-AI capabilities into analytics.

Skills

Data engineering
Data analytics
Python
Azure Data Factory
CI/CD
Machine learning
Data visualization
Problem-solving

Education

Bachelor's or Master's in Computer Science

Tools

Azure Databricks
SQL
Kubernetes
Terraform
Power BI

Job description

Prudential's purpose is to be partners for every life and protectors for every future. Our purpose informs everything we do by creating a culture in which diversity is celebrated and inclusion assured, for our people, customers, and partners. We provide a platform for our people to do their best work and make an impact on the business, and we support our people's career ambitions. We pledge to make Prudential a place where you can Connect, Grow, and Succeed.

This role will be part of the Group-Wide Information Security team under the Security Metrics and Analytics function, with the primary responsibility of developing and implementing in-house data analytics and security automation programs. These initiatives aim to strengthen overall information security controls through automated discovery of security risks, orchestration of risk reduction exercises, and visualization of risk reduction and exposures. The scope covers data security, vulnerability management, network security, threat analysis, and software development.

To succeed, candidates must have a strong passion for IT security automation, machine learning, and data analytics programming using Python, Keras, and TensorFlow.

We are seeking a highly motivated Cloud Analytics Engineer to join our Cloud Economics and Intelligence team within Group Technology Infrastructure Engineering. This role combines data engineering, analytics, and emerging AI technologies to optimize cloud investments and business outcomes.

You will primarily work with Azure Data Factory and Azure Databricks, develop analytical models, and contribute to Gen-AI chatbot/LLM initiatives that enhance operational efficiency and support internal use cases.

Key Responsibilities:

  1. Design and optimize data pipelines and analytics workflows using Azure Databricks, Spark, and Delta Lake.
  2. Develop and support data pipelines, workflows, schedulers, and data schemas/tables that feed dashboards, reports, and data models for cloud cost optimization and operational stability.
  3. Collaborate with AI/ML engineers to integrate Gen-AI capabilities into analytics tools and chatbots.
  4. Contribute to the development of agentic AI chatbots for internal use cases.
  5. Ensure data quality, governance, and performance across data pipelines and analytics platforms.
  6. Translate business requirements into scalable data and AI solutions.

Required Skills & Experience:

  1. 4+ years in data engineering and analytics, with at least 2 years in a cloud environment (Azure & GCP).
  2. Proficiency in Azure Data Factory, Azure Databricks, PySpark, SQL, GCP Cloud Functions, and BigQuery.
  3. Experience with CI/CD, version control (GitHub, Azure DevOps), and best practices development.
  4. Strong Python programming skills for data manipulation and automation.
  5. Experience with containerization (Kubernetes) and infrastructure provisioning (Terraform).
  6. Familiarity with Azure bot service, generative AI, and agentic chatbot frameworks (e.g., LangChain, Mosaic AI, MS AutoGen).
  7. Knowledge of LLMs, embeddings, and prompt engineering is preferred.
  8. Experience with data visualization tools like Power BI and MS Fabric pipelines is a plus.

Preferred Qualifications:

  1. Bachelor's or Master's in Computer Science, Data Science, Engineering, or related fields.
  2. Self-driven, critical thinker, team player, and quick learner.
  3. Strong analytical and problem-solving skills, capable of working independently and collaboratively.

Prudential is an equal opportunity employer. We offer benefits regardless of sex, race, age, ethnicity, education, social background, marital status, pregnancy, religion, disability, or employment status. We support reasonable adjustments for individuals with health requirements.
