Data & AI Senior Engineer

Omron Electronics S.p.A.

Singapore

On-site

SGD 80,000 - 100,000

Full time

Job summary

A leading electronics company in Singapore is seeking an experienced Data Engineer to lead Snowflake-based solutions supporting AI and data initiatives. Responsibilities include designing scalable data pipelines, collaborating with cross-functional teams, and ensuring data governance and security. The ideal candidate has a degree in Computer Science and 4–5 years of experience in Snowflake, along with strong skills in ETL/ELT tools and cloud architecture.

Qualifications

  • 4–5 years of hands-on experience with Snowflake in a production environment.
  • Experience in data warehouse migration and cloud data architecture.
  • Strong programming skills in Python and/or Java.

Responsibilities

  • Design and implement scalable data pipelines for AI and ML.
  • Develop data marts for business consumption.
  • Support data governance processes and ensure compliance.

Skills

Snowflake data cloud implementation
ETL/ELT tools expertise
Cloud data architecture
Data modelling and SQL
Programming in Python or Java
MLOps tools experience
Data warehouse migration
Strong problem-solving skills
Excellent communication skills

Education

Bachelor’s or Master’s degree in Computer Science, IT, or related field

Tools

AWS Glue
dbt
Talend
Informatica
MLflow
Docker
LangChain
Certifications

SnowPro Data Engineer

Job description

Overview

Business Company: OMRON Headquarters (HQ)

Location: Singapore, SG

Employment Type: Permanent

We are seeking a skilled Data Engineer to lead Snowflake-based solutions supporting AI and Data initiatives. This role drives scalable infrastructure, legacy data warehouse migration, and robust ETL/ELT pipelines. The engineer ensures secure, high-performance architecture aligned with AI workflows, while enforcing governance, quality, and seamless integration across platforms.

Responsibilities
  • Analyze internally and externally sourced raw data to generate BI and Advanced Analytics datasets based on stakeholder requirements.
  • Design scalable data pipelines to curate sourced data into the in-house data warehouse.
  • Develop data marts to facilitate dataset consumption by business and IT stakeholders.
  • Propose data model changes that align with in-house data warehouse standards.
  • Define and execute migration activities from legacy databases to the Snowflake-based data warehouse.
  • Implement and manage Snowflake governance (access control, data security, usage monitoring).
  • Support AI use cases through data preparation and integration.
  • Collaborate with cross-functional teams to deliver data-driven solutions.
  • Data Pipeline Development: Design and implement ETL/ELT processes for AI and ML models.
  • Infrastructure Management: Select and manage cloud-based data storage solutions.
  • Model Deployment Support: Prepare datasets and environments for training and deploying AI models.
  • Real-Time Analytics: Handle unstructured data and support real-time data processing.
  • AI-Specific Tools: Work with vector databases, LLM pipelines, and frameworks like TensorFlow or PyTorch.
  • Collaborate with data scientists, ML engineers, and business stakeholders.
  • Ensure data governance, security, and compliance.
  • Monitor and optimise AI model performance.
Qualifications
  • Bachelor’s or Master’s degree in Computer Science, Information Technology, Engineering, or a related quantitative discipline.
  • 4–5 years of hands-on experience implementing Snowflake data cloud in a production environment.
  • Strong expertise in ETL/ELT tools and frameworks (e.g., AWS Glue, dbt, Talend, Informatica).
  • Experience with MLOps tools like MLflow, Docker, and LangChain.
  • Proven experience in data warehouse migration and cloud data architecture.
  • Solid understanding of data modelling, SQL, and BI/Analytics concepts.
  • Experience working closely with business teams to deliver data solutions aligned with strategic goals.
  • Familiarity with Snowflake governance best practices.
  • SnowPro Data Engineer certification.
  • Deep knowledge of Snowflake performance optimisation, governance, and security.
  • Experience with cloud technologies such as AWS RDS, AWS Fargate, AWS S3, and Azure.
  • Familiarity with PostgreSQL, MS SQL, and other relational databases.
  • Strong programming skills in Python and/or Java.
  • Understanding of LLMs, RAG pipelines, and generative AI deployment.
  • Strong problem-solving and analytical thinking.
  • Excellent communication and stakeholder engagement skills.
  • Ability to work independently and manage multiple priorities.
  • Proactive mindset with a focus on continuous improvement.