Hadoop Engineer - ODP Platform

Experis - ManpowerGroup

West Midlands Combined Authority

Hybrid

GBP 60,000 - 80,000

Full time

Posted 3 days ago

Job summary

A leading data consultancy is seeking a skilled Hadoop Engineer to enhance its Operational Data Platform. The role involves designing scalable data pipelines and optimising workflows using Hadoop technologies. The ideal candidate has at least 5 years of experience in data engineering and strong skills in Python and Apache Airflow. The position is hybrid, based in Birmingham or Sheffield, with 3 days onsite per week.

Qualifications

  • Minimum 5 years of experience in Hadoop and data engineering.
  • Strong hands-on experience with Python, Airflow, and Spark Streaming.
  • Deep understanding of Hadoop components in on-prem environments.

Responsibilities

  • Design and maintain data pipelines in a Hadoop environment.
  • Build workflows using Apache Airflow for real-time processing.
  • Collaborate with teams to support data use cases.

Skills

Hadoop
Python
Apache Airflow
Spark Streaming
Linux systems
Data analytics

Tools

Hadoop
Isolation Manager
CI/CD tools

Job description

Role Title: Hadoop Engineer / ODP Platform
Location: Birmingham / Sheffield - Hybrid working with 3 days onsite per week
End Date: 28/11/2025

Role Overview:
We are seeking a highly skilled Hadoop Engineer to support and enhance our Operational Data Platform (ODP) deployed in an on-premises environment.
The ideal candidate will have extensive experience in the Hadoop ecosystem, strong programming skills, and a solid understanding of infrastructure-level data analytics. This role focuses on building and maintaining scalable, secure, and high-performance data pipelines within enterprise-grade on-prem systems.

Key Responsibilities:

  • Design, develop, and maintain data pipelines using Hadoop technologies in an on-premises infrastructure.
  • Build and optimise workflows using Apache Airflow and Spark Streaming for real-time data processing (a minimal Airflow sketch follows this list).
  • Develop robust data engineering solutions using Python for automation and transformation.
  • Collaborate with infrastructure and analytics teams to support operational data use cases.
  • Monitor and troubleshoot data jobs, ensuring reliability and performance across the platform.
  • Ensure compliance with enterprise security and data governance standards.
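
For flavour, below is a minimal sketch of the kind of Airflow workflow this role builds: a daily DAG that lands a file into HDFS and then runs a Python transform step. The DAG id, file paths, and schedule are illustrative assumptions, not details of the actual platform.

# Hypothetical Airflow DAG sketch; dag_id, paths, and schedule are assumed.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator

def transform_batch(ds, **kwargs):
    # Placeholder transform; real logic would validate and reshape records.
    print(f"Transforming batch for {ds}")

with DAG(
    dag_id="odp_ingest_example",  # assumed name, not the real pipeline
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
):
    # Land a daily file into HDFS, then run the Python transform step.
    land_to_hdfs = BashOperator(
        task_id="land_to_hdfs",
        bash_command="hdfs dfs -put -f /data/incoming/{{ ds }}.csv /raw/odp/",
    )
    transform = PythonOperator(
        task_id="transform_batch",
        python_callable=transform_batch,
    )
    land_to_hdfs >> transform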


Required Skills & Experience:

  • Minimum 5 years of experience in Hadoop and data engineering.
  • Strong hands-on experience with Python, Apache Airflow, and Spark Streaming (a minimal Spark Streaming sketch follows this list).
  • Deep understanding of Hadoop components (HDFS, Hive, HBase, YARN) in on-prem environments.
  • Exposure to data analytics, preferably involving infrastructure or operational data.
  • Experience working with Linux systems, shell scripting, and enterprise-grade deployment tools.
  • Familiarity with monitoring and logging tools relevant to on-prem setups.
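
As a rough illustration of the Spark Streaming side, the sketch below shows a minimal PySpark Structured Streaming job that counts events per one-minute window. The Kafka broker address, topic name, and checkpoint path are assumptions for the example, not details from this role.

# Minimal Structured Streaming sketch; broker, topic, and paths are assumed.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, window

spark = SparkSession.builder.appName("odp-stream-example").getOrCreate()

# Read raw events from an assumed Kafka topic; values arrive as bytes.
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # assumed broker
    .option("subscribe", "odp-events")                 # assumed topic
    .load()
    .selectExpr("CAST(value AS STRING) AS value", "timestamp")
)

# Count events per one-minute window as a trivial real-time aggregation.
counts = events.groupBy(window(col("timestamp"), "1 minute")).count()

query = (
    counts.writeStream.outputMode("update")
    .format("console")
    .option("checkpointLocation", "/tmp/odp-checkpoint")  # assumed path
    .start()
)
query.awaitTermination()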


Preferred Qualifications:

  • Experience with enterprise ODP platforms or similar large-scale data systems.
  • Knowledge of configuration management tools (e.g., Ansible, Puppet) and CI/CD in on-prem environments.
  • Understanding of network and storage architecture in data centers.
  • Familiarity with data security, compliance, and audit requirements in regulated industries.