
Data & AI Senior Engineer (Snowflake)

OMRON ASIA PACIFIC PTE LTD

Singapore

On-site

SGD 70,000 - 90,000

Full time

Job summary

A technology company in Singapore is seeking a skilled Data Engineer to develop Snowflake-based solutions supporting AI and Data initiatives. The ideal candidate should have experience in ETL/ELT processes and modern cloud architecture. Responsibilities include designing data pipelines, ensuring compliance, and collaborating with business stakeholders. This role offers a dynamic work environment focused on innovation.

Qualifications

  • 4–5 years of hands-on experience implementing the Snowflake Data Cloud.
  • Experience with cloud technologies such as AWS and Azure.
  • Strong programming skills in Python and/or Java.

Responsibilities

  • Design and implement ETL/ELT processes.
  • Develop scalable data pipelines.
  • Ensure data governance and compliance.

Skills

Snowflake Data Cloud
ETL/ELT tools
AWS Glue
Python
SQL
Data modelling
MLOps
Data governance

Education

Bachelor’s or Master’s degree in Computer Science or related field

Tools

Snowflake
AWS RDS
Docker

Job description

Overview

We are seeking a skilled Data Engineer to lead the development of Snowflake-based solutions supporting AI and data initiatives. The role drives scalable infrastructure, migration of legacy data warehouses, and robust ETL/ELT pipelines. The ideal candidate has hands-on experience implementing the Snowflake Data Cloud, strong ETL/ELT skills, and a proven ability to collaborate with business stakeholders. The engineer ensures a secure, high-performance architecture aligned with AI workflows while enforcing governance, data quality, and seamless integration across platforms.

Design Responsibilities

  • Analyse internally and externally sourced raw data to generate BI and Advanced Analytics datasets based on stakeholder requirements.
  • Design scalable data pipelines to curate sourced data into the in-house data warehouse.
  • Develop data marts to facilitate dataset consumption by business and IT stakeholders.
  • Propose data model changes that align with in-house data warehouse standards.
  • Define and execute migration activities from legacy databases to the Snowflake-based data warehouse.
  • Implement and manage Snowflake governance (access control, data security, usage monitoring).
  • Support AI use cases through data preparation and integration.
  • Collaborate with cross-functional teams to deliver data-driven solutions.

Engineering Responsibilities

  • Data Pipeline Development: Design and implement ETL/ELT processes for AI and ML models.
  • Infrastructure Management: Select and manage cloud-based data storage solutions.
  • Model Deployment Support: Prepare datasets and environments for training and deploying AI models.
  • Real-Time Analytics: Handle unstructured data and support real-time data processing.
  • AI-Specific Tools: Work with vector databases, LLM pipelines, and frameworks like TensorFlow or PyTorch.

Collaboration & Governance

  • Collaborate with data scientists, ML engineers, and business stakeholders.
  • Ensure data governance, security, and compliance.
  • Monitor and optimise AI model performance.

Requirements

  • Bachelor’s or Master’s degree in Computer Science, Information Technology, Engineering, or a related quantitative discipline.
  • 4–5 years of hands-on experience implementing the Snowflake Data Cloud in a production environment.
  • Strong expertise in ETL/ELT tools and frameworks (e.g., AWS Glue, dbt, Talend, Informatica).
  • Experience with MLOps tools like MLflow, Docker, and LangChain.
  • Proven experience in data warehouse migration and cloud data architecture.
  • Solid understanding of data modelling, SQL, and BI/Analytics concepts.
  • Experience working closely with business teams to deliver data solutions aligned with strategic goals.
  • Familiarity with Snowflake governance best practices.
  • SnowPro Data Engineer certification.
  • Deep knowledge of Snowflake performance optimisation, governance, and security.
  • Experience with cloud technologies such as AWS (RDS, Fargate, S3) and Azure.
  • Familiarity with PostgreSQL, MS SQL, and other relational databases.
  • Strong programming skills in Python and/or Java.
  • Understanding of LLMs, RAG pipelines, and generative AI deployment.
  • Strong problem-solving and analytical thinking.
  • Excellent communication and stakeholder engagement skills.
  • Ability to work independently and manage multiple priorities.
  • Proactive mindset with a focus on continuous improvement.