Job Search and Career Advice Platform

Data Engineer

QAD

Remote

GBP 60,000 - 80,000

Full time

2 days ago

Job summary

A leading software provider is seeking a data engineer to design and optimize data pipelines for AI and data science teams. This fully remote role requires 5+ years of data engineering experience, particularly with Snowflake, AWS services, and advanced SQL and Python skills. The successful candidate will manage data quality and governance while integrating AI/ML models into production workflows. Join a mission-driven team focused on optimizing processes in manufacturing and supply chain management.

Qualifications

  • 5+ years of experience in data engineering with cloud expertise.
  • Must-have Snowflake expertise, including optimization and security skills.
  • Experience in preparing data for AI/ML integration.

Responsibilities

  • Design and maintain scalable data pipelines.
  • Implement multi-source ETL/ELT flows.
  • Prepare data for Data Science teams.

Skills

Data pipeline design
Snowflake expertise
AWS proficiency
SQL
Python
ETL/ELT pipeline design

Tools

Snowflake
AWS S3
AWS IAM
AWS Glue
AWS Lambda

Job description

QAD Inc. is a leading provider of adaptive, cloud-based enterprise software and services for global manufacturing companies. Global manufacturers face ever-increasing disruption caused by technology-driven innovation and changing consumer preferences. In order to survive and thrive, manufacturers must be able to innovate and change business models at unprecedented rates of speed. QAD calls these companies Adaptive Manufacturing Enterprises. QAD solutions help customers in the automotive, life sciences, packaging, consumer products, food and beverage, high tech and industrial manufacturing industries rapidly adapt to change and innovate for competitive advantage.

We are looking for talented individuals who want to join us on our mission to help solve relevant real-world problems in manufacturing and the supply chain.

This role is fully remote in the UK and requires existing work authorization; no visa sponsorship is available.

In a data-driven and AI-oriented environment, you will be responsible for the design, industrialization, and optimization of inter-application data pipelines. You will be involved in the entire data chain, from ingestion through to its use by data science teams and AI systems in production, as part of a small, multidisciplinary team. The role sits within the Process Intelligence (PI) team, which combines functions such as Process Mining, Real-Time Monitoring, and Predictive AI.

Key responsibilities:
  • Design and maintain scalable data pipelines.
  • Structure, transform, and optimize data in Snowflake.
  • Implement multi-source ETL/ELT flows (ERP, APIs, files).
  • Leverage the AWS environment, including S3, IAM, and various data services.
  • Prepare data for Data Science teams and integrate AI/ML models into production.
  • Ensure data quality, security, and governance.
  • Provide input on data architecture.

Qualifications
  • 5+ years of experience in data engineering, including significant experience in a cloud environment.
  • Snowflake (MUST HAVE): Expertise in modeling, query optimization, cost management, and security.
  • AWS: Strong knowledge of data and cloud services including S3, IAM, Glue, and Lambda.
  • Languages: Advanced SQL and Python for data manipulation, automation, and ML integration.
  • Data Engineering: Proven experience in ETL/ELT pipeline design.
  • AI/ML Integration: Ability to prepare data for model training and deploy AI models into production workflows (batch or real-time).

Nice to Have:
  • Experience with agentic AI architectures, including agent orchestration and decision loops.
  • Integration of agent-driven AI models into existing data pipelines.
  • Knowledge of modern architectures such as Lakehouse or Data Mesh.