Data Architect APAC

Zurich Insurance

Singapore

On-site

SGD 100,000 - 140,000

Full time


Job summary

A leading global insurance company is seeking a Senior Data Engineer/Data Architect to design and deliver next-generation Generative AI platform solutions. The ideal candidate will have over 8 years of experience in large-scale distributed systems, especially in cloud-native AI/ML workloads. Responsibilities include leading projects, writing production-grade code, and mentoring engineers. This position offers opportunities to innovate and grow within a supportive and diverse environment.

Qualifications

  • 8+ years of experience building data platforms or large-scale distributed systems.
  • At least 3 years in cloud-native AI/ML workloads.
  • Proven delivery of GenAI/LLM products in production.

Responsibilities

  • Design and deliver next-generation Generative AI platform solutions.
  • Lead multi-component projects and mentor engineers.
  • Write production-grade code and build ingestion pipelines.

Skills

Python
Data governance
Communication skills
Docker
SQL

Tools

Azure Redis
Terraform
GitHub Actions

Job description

Overview

Who we are

Looking for a career that will excite, challenge and inspire you? Thinking about insurance? Perhaps you should. Working for us is a totally different experience to what you probably expect. How do you feel about the things you truly love? Don’t you want to protect them in the best way possible? Imagine if you could help people do this all over the world. You’d give them confidence and reassurance by protecting what they love most. This is no easy task. In today’s interconnected world, tackling risk is fast, unpredictable and invigorating. You’ll have to think on your feet as you manage risks big and small, from flooding to cyber-crime. You’ll be tackling issues like these in over 170 countries. It’s a big challenge, but you’ll have a truly diverse network helping you. As part of an international team, every day would provide opportunities to learn, grow and share ideas.

As you make an impact across borders, you’ll feel the support of being part of a strong and stable company. A long-standing player in the insurance industry, we make every effort to address the career development needs and plans of our employees to ensure their success in the future.

So make a difference. Be challenged. Be inspired. Be supported. Love what you do. Work for us.

Zurich Insurance is an equal opportunity employer. We aim to attract and retain the best qualified individuals available, without regard to criteria such as race/ethnicity, national origin, religion, gender, sexual orientation, age or disability.

At Zurich we believe that having a culture of inclusion is essential in delivering good results. Attracting, retaining and developing a diverse workforce where employees feel valued, respected and empowered allows people to reach their full potential. As a business, this diversity helps us to better reflect and understand our 4 million customers’ needs and drive better outcomes. As a global organisation with an increasingly agile workforce, we’re happy to consider flexible working arrangements.

Our opportunity

We are building the next‑generation Generative AI platform solution that will power multiple business lines and connect to upstream and downstream systems.

Your role

As our Senior Data Engineer / Data Architect, you will own the end‑to‑end design and delivery of this platform: architecting, coding, shipping, and scaling it in production in coordination with our Data Scientists. This is a hands‑on leadership role: we are looking for builders, not slide‑makers.

  • Architecture & Design
      • Shape the overall data & GenAI architecture (batch / streaming pipelines, vector stores, feature stores, LLM‑Ops tooling, API layer)
      • Select frameworks, storage engines, orchestration tools, and security patterns that will stand the test of scale and time
  • Hands‑on Engineering
      • Write production‑grade code
      • Build ingestion pipelines, retrieval‑augmented‑generation (RAG) services, micro‑services, and CI/CD workflows (a minimal RAG service sketch follows this list)
      • Conduct code reviews, performance tuning, and zero‑downtime releases
      • Run and monitor large‑scale AI workloads in the cloud (Azure / AWS)
  • Delivery Leadership
      • Lead multi‑component projects
      • Break down epics, estimate effort, and remove blockers for the squad
      • Report progress and trade‑offs directly to senior business & tech stakeholders
  • Mentoring & Culture
      • Coach engineers on best practices in data engineering, DevOps, and LLM‑Ops
      • Foster a “get‑stuff‑done” culture: passion, autonomy, accountability
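
By way of illustration only, here is a minimal sketch of what one of these RAG micro‑services could look like in Python, assuming FastAPI for the API layer; the retrieve and generate helpers are hypothetical placeholders standing in for a real vector‑store query and LLM call:

    from fastapi import FastAPI
    from pydantic import BaseModel

    app = FastAPI()

    class Query(BaseModel):
        question: str
        top_k: int = 4

    def retrieve(question: str, top_k: int) -> list[str]:
        # Hypothetical placeholder: a real service would embed the question and
        # query a vector store such as FAISS, Milvus, or Pinecone for the top_k hits.
        return []

    def generate(question: str, context: list[str]) -> str:
        # Hypothetical placeholder: a real service would stitch the retrieved
        # context into a prompt and call an LLM.
        return f"Answer to {question!r} based on {len(context)} retrieved chunks."

    @app.post("/query")
    def answer(query: Query) -> dict:
        # Retrieval-augmented generation: fetch supporting context, then generate.
        context = retrieve(query.question, query.top_k)
        return {"answer": generate(query.question, context), "sources": context}

In a real deployment the placeholders would be backed by the vector stores and LLM‑Ops tooling listed in the qualifications below.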

As our Senior Data Engineer / Data Architect, your skills and qualifications will ideally include:

  • 8+ years building data platforms or large‑scale distributed systems, with at least 3 years in cloud‑native AI / ML workloads
  • Strong experience in Python
  • Proven delivery of GenAI / LLM products in production: RAG pipelines, vector databases (Pinecone, Milvus, FAISS, etc.), LangChain, LlamaIndex, or equivalent; prompt orchestration, fine‑tuning, and model monitoring
  • Mastery of modern data tooling: SQL & NoSQL stores, Docker, Terraform, GitHub Actions/Azure DevOps
  • Strong grasp of data governance, security, and privacy for regulated industries (insurance / financial services a plus)
  • Experience designing and implementing systems for managing agent long- and short-term memory, including structured logging of interactions and collection of user feedback, to support continuous learning and performance improvement using technologies such as Azure Redis or PostgreSQL (a minimal logging sketch follows this list)
  • Experience developing and enforcing app- and user-level authorization and authentication mechanisms for agent APIs to ensure secure data access and prevent unauthorized exposure across business units
  • Track record of sticky retention: you build, iterate, and scale products over several years
  • Experience with AI agent observability and performance monitoring: reporting on key performance metrics to maintain reliability and optimize outcomes
  • Excellent communication skills; able to translate architectural trade‑offs to both engineers and executives
  • Experience with Model Context Protocol (MCP) workflows and standardization of AI-driven tool calling
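
As a rough, non-prescriptive sketch of the agent-memory item above, assuming the redis-py client (PostgreSQL could back the same interface); the key layout, TTL, and feedback field are illustrative assumptions:

    import json
    import time
    import uuid

    import redis  # assumes the redis-py client

    # Connection details are illustrative only.
    r = redis.Redis(host="localhost", port=6379, decode_responses=True)

    SESSION_TTL_SECONDS = 60 * 60  # short-term memory window: one hour per session

    def log_interaction(session_id: str, role: str, content: str,
                        feedback: int | None = None) -> str:
        """Append one structured interaction record to the session's memory."""
        record = {
            "id": str(uuid.uuid4()),
            "ts": time.time(),
            "role": role,          # "user", "assistant", "tool", ...
            "content": content,
            "feedback": feedback,  # e.g. thumbs up / down collected from the UI
        }
        key = f"agent:session:{session_id}"
        r.rpush(key, json.dumps(record))    # structured log of the interaction
        r.expire(key, SESSION_TTL_SECONDS)  # refresh the short-term-memory TTL
        return record["id"]

    def recent_history(session_id: str, limit: int = 20) -> list[dict]:
        """Fetch the most recent interactions for prompt construction."""
        raw = r.lrange(f"agent:session:{session_id}", -limit, -1)
        return [json.loads(item) for item in raw]

A relational store such as PostgreSQL would swap the list operations for inserts into an interactions table, making long-term retention and feedback analytics easier to query.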