Data Engineer (AI / ML)

VML

Remote

GBP 50,000 - 70,000

Full time

Job summary

A forward-thinking AI company is seeking a Data Engineer (AI / ML) to design and maintain scalable data solutions that support innovative AI tools. The successful candidate will collaborate with data scientists and developers to optimize data workflows, leveraging cloud platforms and ensuring access to reliable data. Responsibilities include designing robust data pipelines, developing APIs for data services, and optimizing data transformations. A commitment to transparency, inclusivity, and personal growth within the team culture is emphasized.

Benefits

Enhanced pension
Private healthcare
27 days holiday plus bank holidays
Annual bonus
Flexible working hours
People-oriented culture
Focus on personal development

Skills

Python
SQL
Data structures
Database operations
Cloud platforms (AWS, GCP, Azure)
Docker
Kubernetes
Retrieval-Augmented Generation (RAG)
Data pipeline design
Big data processing (e.g., Spark)

Tools

PostgreSQL
MySQL
Pinecone
Weaviate
ChromaDB

Job description
Role: Data Engineer (AI / ML)

Role type: Permanent

Location: UK or Greece

Preferred start date: ASAP

As an organisation, we push the boundaries of data science, optimisation and artificial intelligence to solve the most complex problems in the industry. Satalia, a WPP company, is a community of individuals devoted to working on diverse and challenging projects, allowing you to flex your technical skills whilst working with a tight‑knit team of high‑performing colleagues.

Led by our founder and WPP Chief AI Officer Daniel Hulme, Satalia's ambition is to become a decentralised organisation of the future. Today, this involves developing tools and processes to liberate and automate manual repetitive tasks, with a focus on freedom, transparency and trust. At the core of our thinking is an approach to wellbeing and inclusivity. We unpack human behaviour and unpick prejudice to ensure a safe and inviting environment. We offer truly flexible working and allow our employees to find the working practice that makes them most productive. At Satalia, your opinion matters and your achievements are celebrated.

THE ROLE

We are investing massively in developing next‑generation AI tools for multimodal datasets and a wide range of applications. We are building large‑scale, enterprise‑grade solutions and serving these innovations to our clients and WPP agency partners. As a member of our team, you will work alongside world‑class talent in an environment that not only fosters innovation but also personal growth. You will be at the forefront of AI, leveraging multimodal datasets to build groundbreaking solutions over a multi‑year roadmap. Your contributions will directly shape cutting‑edge AI products and services that make a tangible impact for FTSE 100 clients.

YOUR RESPONSIBILITIES
  • Collaborate closely with data scientists, architects, and other stakeholders to understand and break down business requirements.
  • Collaborate on schema design, data contracts, and architecture decisions, ensuring alignment with AI/ML needs.
  • Provide data engineering support for AI model development and deployment, ensuring data scientists have access to the data they need in the format they need it.
  • Leverage cloud‑native tools (GCP/AWS/Azure) for orchestrating data pipelines, AI inference workloads, and scalable data services.
  • Develop and maintain APIs for data services and serving model predictions.
  • Support the development, evaluation and productionisation of agentic systems with:
    • LLM‑powered features and prompt engineering
    • Retrieval‑Augmented Generation (RAG) pipelines
    • Multimodal vector embeddings and vector stores
    • Agent development frameworks: ADK, LangGraph, Autogen
    • Model Context Protocol (MCP) for integrating agents with tools, data and AI services
    • Google's Agent2Agent (A2A) protocol for communication and collaboration between different AI agents
  • Implement and optimise data transformations and ETL/ELT processes, using appropriate data engineering tools.
  • Work with a variety of databases and data warehousing solutions to store and retrieve data efficiently.
  • Implement monitoring, troubleshooting, and maintenance procedures for data pipelines to ensure high data quality and optimise performance.
  • Participate in the creation and ongoing maintenance of documentation, including data flow diagrams, architecture diagrams, data dictionaries, data catalogues, and process documentation.
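As a toy illustration of the Retrieval‑Augmented Generation work described above (a hypothetical sketch, not Satalia's actual stack), the snippet below retrieves the documents most similar to a query and assembles a grounded prompt. The bag‑of‑words `embed` helper is a stand‑in for a real embedding model and vector store:

```python
from collections import Counter
from math import sqrt

def embed(text: str) -> Counter:
    # Toy bag-of-words "embedding"; a real pipeline would call an
    # embedding model and persist the vectors in a vector database.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse term-count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    # Rank documents by similarity to the query and keep the top k.
    q = embed(query)
    ranked = sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

def build_prompt(query: str, docs: list[str]) -> str:
    # Augment the user question with the retrieved context.
    context = "\n".join(f"- {d}" for d in retrieve(query, docs))
    return f"Answer using this context:\n{context}\n\nQuestion: {query}"

docs = [
    "Satalia builds AI and optimisation tools.",
    "The data platform runs on GCP.",
    "Office plants are watered on Fridays.",
]
print(build_prompt("data platform cloud", docs))
```

A production pipeline would swap `embed` for an embedding‑model call and `retrieve` for a vector‑database query, but the retrieve‑then‑augment shape stays the same.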

MINIMUM QUALIFICATIONS / SKILLS
  • High proficiency in Python and SQL.
  • Strong knowledge of data structures, data modelling, and database operations.
  • Proven hands‑on experience building and deploying data solutions on a major cloud platform (AWS, GCP, or Azure).
  • Familiarity with containerisation technologies such as Docker and Kubernetes.
  • Familiarity with Retrieval‑Augmented Generation (RAG) applications and modern AI/LLM frameworks (e.g., LangChain, Haystack, Google GenAI, etc.).
  • Demonstrable experience designing, implementing, and optimising robust data pipelines for performance, reliability, and cost‑effectiveness in a cloud‑native environment.
  • Experience in supporting data science workloads and working with both structured and unstructured data.
  • Experience working with both relational (e.g., PostgreSQL, MySQL) and NoSQL databases.
  • Experience with a big data processing framework (e.g., Spark).

PREFERRED QUALIFICATIONS / SKILLS
  • API Development: Experience building and deploying scalable and secure API services using a framework like FastAPI, Flask, or similar.
  • Experience partnering with data scientists to automate pipelines for model training, evaluation, and inference, contributing to a robust MLOps cycle.
  • Hands‑on experience designing, building, evaluating, and productionising RAG systems and agentic AI workflows.
  • Hands‑on experience with vector databases (e.g., Pinecone, Weaviate, ChromaDB).
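Purely for illustration of what a vector database provides, here is a minimal in‑memory sketch of upsert and nearest‑neighbour query. This class is hypothetical and does not reproduce the actual API of Pinecone, Weaviate, or ChromaDB, which add persistence, approximate‑nearest‑neighbour indexing, and metadata filtering:

```python
from math import sqrt

class InMemoryVectorStore:
    """Toy stand-in for a vector database: stores id -> vector and
    answers top-k queries by exact cosine similarity."""

    def __init__(self) -> None:
        self._vectors: dict[str, list[float]] = {}

    def upsert(self, doc_id: str, vector: list[float]) -> None:
        # Insert or overwrite the vector for this document id.
        self._vectors[doc_id] = vector

    def query(self, vector: list[float], top_k: int = 3) -> list[tuple[str, float]]:
        # Score every stored vector and return the k best (id, score) pairs.
        def cos(a: list[float], b: list[float]) -> float:
            dot = sum(x * y for x, y in zip(a, b))
            na = sqrt(sum(x * x for x in a))
            nb = sqrt(sum(x * x for x in b))
            return dot / (na * nb) if na and nb else 0.0

        scored = [(d, cos(vector, v)) for d, v in self._vectors.items()]
        return sorted(scored, key=lambda t: t[1], reverse=True)[:top_k]

store = InMemoryVectorStore()
store.upsert("doc-a", [1.0, 0.0])
store.upsert("doc-b", [0.0, 1.0])
print(store.query([0.9, 0.1], top_k=1))  # doc-a is the closest match
```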

WE OFFER
  • Benefits – enhanced pension, life assurance, income protection, private healthcare;
  • Remote working – café, bedroom, beach – wherever works;
  • Truly flexible working hours – school pick up, volunteering, gym;
  • Generous Leave – 27 days holiday plus bank holidays and enhanced family leave;
  • Annual bonus – when Satalia does well, we all do well;
  • Impactful projects – focus on bringing meaningful social and environmental change;
  • People‑oriented culture – wellbeing is a priority, as is being a nice person;
  • Transparent and open culture – you will be heard;
  • Development – focus on bringing the best out of each other.

Satalia is home to some of the brightest minds in AI, and if you're looking to join a company that not only values autonomy and freedom but also embraces a culture of inclusion and warmth, we'd love to hear from you.

We aim to respond to all applications within 2 weeks. If you have not heard from us within 2 weeks, your application has been unsuccessful.

By applying to Satalia you are expressly giving your consent for the collection and use of your information as described within our Satalia Recruitment Privacy Policy.

Good luck!
