Machine Learning Engineer

Vallum Associates

United Kingdom

Remote

GBP 60,000 - 80,000

Full time

Today

Job summary

A technology consulting firm in the UK is seeking an experienced MLOps Data Engineer to design, build, and maintain data pipelines and machine learning infrastructure. This role involves developing robust solutions for ML lifecycle management and requires strong programming skills in Python and SQL, along with proficiency in AWS and data engineering tools. Join a dynamic team to operationalize models and ensure data integrity.

Qualifications

  • Strong programming skills in Python and SQL.
  • Experience with AWS and data engineering tools.
  • Hands-on experience with MLOps frameworks like MLflow and Kubeflow.

Responsibilities

  • Develop and maintain data pipelines for ML and analytics.
  • Implement MLOps best practices for model deployment.
  • Build and optimize ETL/ELT processes for data.

Skills

Python
SQL
AWS
Data engineering tools
MLOps frameworks
Containerization
Data modeling

Tools

Spark
Kafka
Airflow
dbt
MLflow
Kubeflow
Vertex AI
SageMaker
Docker
Kubernetes

Job description

(Remote)

You will be designing, building, and maintaining data pipelines and machine learning infrastructure that support scalable, reliable, and production-ready AI/ML solutions. You will work closely with data scientists, engineers, and product teams to operationalize models, streamline workflows, and ensure data quality and availability.

  • Develop and maintain data pipelines to support machine learning and analytics use cases.

  • Implement MLOps best practices for model deployment, monitoring, and lifecycle management.

  • Build and optimize ETL/ELT processes for structured and unstructured data.

  • Automate workflows for training, testing, and deploying ML models.

  • Ensure data integrity, governance, and security across the ML lifecycle.
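The ETL and data-integrity duties above can be sketched in miniature. This is an illustrative stdlib-only example, not code from the role: the function names (`extract`, `transform`, `load`), the column names, and the quarantine-by-dropping policy are all assumptions; a production pipeline would typically run inside an orchestrator such as Airflow and load into a warehouse rather than a Python list.

```python
"""Minimal sketch of an ETL step with a basic data-integrity check."""
import csv
import io

def extract(raw_csv: str) -> list[dict]:
    """Parse raw CSV text into row dicts (the 'extract' stage)."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows: list[dict]) -> list[dict]:
    """Normalize types and drop rows that fail basic integrity checks."""
    clean = []
    for row in rows:
        try:
            clean.append({"user_id": int(row["user_id"]),
                          "score": float(row["score"])})
        except (KeyError, ValueError):
            continue  # malformed row: drop (a real pipeline might quarantine it)
    return clean

def load(rows: list[dict], sink: list) -> int:
    """Append validated rows to a sink; return the number loaded."""
    sink.extend(rows)
    return len(rows)

raw = "user_id,score\n1,0.9\n2,not_a_number\n3,0.4\n"
warehouse: list[dict] = []
loaded = load(transform(extract(raw)), warehouse)
print(loaded)  # 2 of the 3 rows pass the integrity check
```

The same extract/transform/load shape applies whether the stages are Spark jobs, dbt models, or Airflow tasks; only the execution substrate changes.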

MLOps Data Engineer Experience

  • Strong programming skills in Python and SQL, and experience with AWS.

  • Proficiency with data engineering tools (e.g., Spark, Kafka, Airflow, dbt).

  • Hands-on experience with MLOps frameworks (e.g., MLflow, Kubeflow, Vertex AI, SageMaker).

  • Familiarity with CI/CD pipelines and containerization (Docker, Kubernetes).

  • Solid understanding of data modeling, warehousing, and APIs.

  • Strong problem-solving skills and ability to work in agile environments.
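The "automate training, testing, and deployment" workflow that MLOps frameworks provide can be sketched as dependency-driven task execution. This is a toy stdlib-only illustration of the pattern, not Airflow or Kubeflow code: the task names, the metric, and the 0.85 quality gate are invented for the example.

```python
"""Toy orchestration of a train -> evaluate -> deploy workflow,
run in dependency order, with an evaluation gate before deployment."""
from graphlib import TopologicalSorter

results: dict[str, object] = {}

def train():
    results["model"] = {"accuracy": 0.91}  # stand-in for a real training run

def evaluate():
    # Quality gate: block deployment if the model underperforms.
    results["passed"] = results["model"]["accuracy"] >= 0.85

def deploy():
    if results.get("passed"):
        results["deployed"] = True

# Each task maps to the set of tasks it depends on.
TASKS = {"train": set(), "evaluate": {"train"}, "deploy": {"evaluate"}}
STEPS = {"train": train, "evaluate": evaluate, "deploy": deploy}

for name in TopologicalSorter(TASKS).static_order():
    STEPS[name]()

print(results["deployed"])  # True: the gate passed, so deploy ran
```

Real orchestrators add what this sketch omits: retries, scheduling, monitoring, and per-task isolation (often one container per task, which is where Docker and Kubernetes enter the picture).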
