
Data & AI Senior Engineer

OMRON Corporation

Singapore

On-site

SGD 70,000 - 100,000

Full time

4 days ago

Job summary

A leading technology firm in Singapore is seeking a skilled Data Engineer to lead Snowflake-based solutions supporting AI and data initiatives. The ideal candidate will have strong ETL/ELT capabilities and experience in migrating data warehouses. This role involves collaborating with business stakeholders and ensuring high-performance architecture aligned with AI workflows.

Qualifications

  • 4–5 years of hands-on experience implementing Snowflake in production.
  • Solid understanding of data modelling, SQL, and BI concepts.
  • Experience with cloud technologies like AWS and Azure.

Responsibilities

  • Design scalable data pipelines for the data warehouse.
  • Implement ETL/ELT processes for AI and ML models.
  • Collaborate with cross-functional teams.

Skills

Snowflake Data Cloud implementation
ETL/ELT processes
MLOps
Data governance
Python programming
Communication skills

Education

Bachelor’s or Master’s degree in Computer Science or related field

Tools

AWS Glue
dbt
Talend
MLflow
Docker
LangChain

Job description

Seeking a skilled Data Engineer to lead Snowflake-based solutions supporting AI and data initiatives. This role drives scalable infrastructure, legacy data warehouse migration, and robust ETL/ELT pipelines. The ideal candidate will have hands-on experience implementing Snowflake Data Cloud, strong ETL/ELT capabilities, and a proven ability to collaborate with business stakeholders. The engineer ensures secure, high-performance architecture aligned with AI workflows, while enforcing governance, quality, and seamless integration across platforms.

Design Responsibilities
  • Analyse internally and externally sourced raw data to generate BI and Advanced Analytics datasets based on stakeholder requirements.
  • Design scalable data pipelines to curate sourced data into the in-house data warehouse.
  • Develop data marts to facilitate dataset consumption by business and IT stakeholders.
  • Propose data model changes that align with in-house data warehouse standards.
  • Define and execute migration activities from legacy databases to the Snowflake-based data warehouse.
  • Implement and manage Snowflake governance (access control, data security, usage monitoring).
  • Support AI use cases through data preparation and integration.
  • Collaborate with cross-functional teams to deliver data-driven solutions.
Engineering Responsibilities
  • Data Pipeline Development: Design and implement ETL/ELT processes for AI and ML models.
  • Infrastructure Management: Select and manage cloud-based data storage solutions.
  • Model Deployment Support: Prepare datasets and environments for training and deploying AI models.
  • Real-Time Analytics: Handle unstructured data and support real-time data processing.
  • AI-Specific Tools: Work with vector databases, LLM pipelines, and frameworks like TensorFlow or PyTorch.
Collaboration & Governance
  • Collaborate with data scientists, ML engineers, and business stakeholders.
  • Ensure data governance, security, and compliance.
  • Monitor and optimise AI model performance.
Requirements
  • Bachelor’s or Master’s degree in Computer Science, Information Technology, Engineering, or a related quantitative discipline.
  • 4–5 years of hands-on experience implementing Snowflake Data Cloud in a production environment.
  • Strong expertise in ETL/ELT tools and frameworks (e.g., AWS Glue, dbt, Talend, Informatica).
  • Experience with MLOps tools like MLflow, Docker, and LangChain.
  • Proven experience in data warehouse migration and cloud data architecture.
  • Solid understanding of data modelling, SQL, and BI/Analytics concepts.
  • Experience working closely with business teams to deliver data solutions aligned with strategic goals.
  • Familiarity with Snowflake governance best practices.
  • SnowPro Data Engineer certification.
  • Deep knowledge of Snowflake performance optimisation, governance, and security.
  • Experience with cloud technologies such as AWS RDS, AWS Fargate, AWS S3, and Azure.
  • Familiarity with PostgreSQL, MS SQL, and other relational databases.
  • Strong programming skills in Python and/or Java.
  • Understanding of LLMs, RAG pipelines, and generative AI deployment.
  • Strong problem-solving and analytical thinking.
  • Excellent communication and stakeholder engagement skills.
  • Ability to work independently and manage multiple priorities.
  • Proactive mindset with a focus on continuous improvement.