Data Engineer

Experis - ManpowerGroup

Sheffield

Hybrid

GBP 60,000 - 80,000

Full time

2 days ago

Job summary

A recruitment agency is seeking a Data Engineer for a hybrid role based in Sheffield. You will design and maintain data pipelines, integrate telemetry into Splunk, and collaborate with platform and application teams to enhance observability capabilities. Required skills include Kafka streaming, OpenShift/Kubernetes telemetry, and strong Python data engineering, along with excellent problem-solving and communication skills. This position is central to driving proactive monitoring and automation within data systems.

Responsibilities

  • Design, implement, and maintain data pipelines for OpenShift telemetry.
  • Stream telemetry via Kafka and build resilient consumer services.
  • Engineer data models and routing for multi-tenant observability.
  • Integrate processed telemetry into Splunk for visualization.
  • Implement schema management, governance, and versioning for telemetry events.
  • Build automated validation and backfill mechanisms for data reliability.
  • Instrument services with OpenTelemetry for logging and tracing.
  • Use LLMs to enhance observability capabilities.
  • Collaborate with teams to integrate telemetry and SLOs.
  • Ensure compliance and best practices for pipelines.
  • Document data flows and operational runbooks.

Skills

  • Building streaming data pipelines with Kafka
  • OpenShift/Kubernetes telemetry
  • Integrating telemetry into Splunk
  • Data engineering skills in Python
  • Knowledge of event schemas
  • Familiarity with observability standards
  • Understanding of hybrid cloud telemetry
  • Security and compliance for data pipelines
  • Problem-solving skills
  • Communication and documentation skills

Job description

Overview

Role Title: Data Engineer
Location: Sheffield | Hybrid (60% office, 40% home)
Duration: until 30/11/2026
Rate: £550p/d max via Umbrella

Responsibilities
  • Design, implement, and maintain data pipelines to ingest and process OpenShift telemetry (metrics, logs, traces) at scale.
  • Stream OpenShift telemetry via Kafka (producers, topics, schemas) and build resilient consumer services for transformation and enrichment; a minimal consume-enrich-forward sketch follows this list.
  • Engineer data models and routing for multi-tenant observability; ensure lineage, quality, and SLAs across the stream layer.
  • Integrate processed telemetry into Splunk for visualization, dashboards, alerting, and analytics to achieve Observability Level 4 (proactive insights).
  • Implement schema management (Avro/Protobuf), governance, and versioning for telemetry events.
  • Build automated validation, replay, and backfill mechanisms for data reliability and recovery (see the timestamp-based replay sketch after this list).
  • Instrument services with OpenTelemetry; standardize tracing, metrics, and structured logging across platforms (an instrumentation sketch also follows this list).
  • Use LLMs to enhance observability capabilities (e.g., query assistance, anomaly summarization, runbook generation).
  • Collaborate with platform, SRE, and application teams to integrate telemetry, alerts, and SLOs.
  • Ensure security, compliance, and best practices for data pipelines and observability platforms.
  • Document data flows, schemas, dashboards, and operational runbooks.
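
The consumer side of this pipeline might take the following shape. This is a minimal sketch in Python, assuming confluent-kafka and requests are available; the topic name, consumer group, sourcetype, index, and environment variables are illustrative placeholders, not details from this posting.

```python
import json
import os

import requests
from confluent_kafka import Consumer, KafkaError

# Placeholder endpoint/credentials, supplied via the environment here.
SPLUNK_HEC_URL = os.environ["SPLUNK_HEC_URL"]      # e.g. https://splunk:8088/services/collector/event
SPLUNK_HEC_TOKEN = os.environ["SPLUNK_HEC_TOKEN"]

consumer = Consumer({
    "bootstrap.servers": os.environ.get("KAFKA_BROKERS", "localhost:9092"),
    "group.id": "telemetry-enricher",            # hypothetical consumer group
    "auto.offset.reset": "earliest",
    "enable.auto.commit": False,                 # commit only after delivery
})
consumer.subscribe(["openshift.telemetry.raw"])  # hypothetical topic


def enrich(event: dict) -> dict:
    """Attach routing/lineage metadata before the event reaches Splunk."""
    event.setdefault("cluster", "unknown")
    event["pipeline_stage"] = "enriched"
    return event


def send_to_splunk(event: dict) -> None:
    # Splunk HEC expects a JSON envelope with the payload under "event".
    resp = requests.post(
        SPLUNK_HEC_URL,
        headers={"Authorization": f"Splunk {SPLUNK_HEC_TOKEN}"},
        json={"event": event, "sourcetype": "openshift:telemetry", "index": "telemetry"},
        timeout=10,
    )
    resp.raise_for_status()


try:
    while True:
        msg = consumer.poll(1.0)
        if msg is None:
            continue
        if msg.error():
            if msg.error().code() != KafkaError._PARTITION_EOF:
                raise RuntimeError(msg.error())
            continue
        send_to_splunk(enrich(json.loads(msg.value())))
        consumer.commit(msg)  # at-least-once: commit after Splunk accepts
finally:
    consumer.close()
```

Committing offsets only after the HEC call succeeds gives at-least-once delivery; duplicates would then need to be tolerated or de-duplicated downstream in Splunk.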
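For the replay/backfill requirement, Kafka's offsets-for-times lookup is one plausible mechanism. This sketch (again assuming confluent-kafka, with a hypothetical topic and replay window) seeks every partition to the offset closest to a chosen timestamp and re-reads from there.

```python
from datetime import datetime, timezone

from confluent_kafka import Consumer, TopicPartition

TOPIC = "openshift.telemetry.raw"                        # hypothetical topic
REPLAY_FROM = datetime(2025, 1, 1, tzinfo=timezone.utc)  # hypothetical window

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "telemetry-backfill",
    "enable.auto.commit": False,  # a backfill job should not move live offsets
})

# Ask the broker which offset corresponds to the replay timestamp in each
# partition, then consume from exactly those offsets.
ts_ms = int(REPLAY_FROM.timestamp() * 1000)
metadata = consumer.list_topics(TOPIC, timeout=10)
partitions = [TopicPartition(TOPIC, p, ts_ms)
              for p in metadata.topics[TOPIC].partitions]
consumer.assign(consumer.offsets_for_times(partitions, timeout=10))

while True:
    msg = consumer.poll(1.0)
    if msg is None or msg.error():
        continue
    # Feed the historical event through the same validation/enrichment
    # path as the live stream.
    print(msg.topic(), msg.partition(), msg.offset())
```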
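OpenTelemetry instrumentation in Python follows a standard provider/exporter pattern. In this sketch the console exporter stands in for an OTLP exporter pointed at a collector, and the service and span names are illustrative.

```python
from opentelemetry import trace
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor, ConsoleSpanExporter

# Name the service so spans are attributable across platforms.
provider = TracerProvider(
    resource=Resource.create({"service.name": "telemetry-enricher"})
)
# ConsoleSpanExporter stands in for an OTLP exporter in this sketch.
provider.add_span_processor(BatchSpanProcessor(ConsoleSpanExporter()))
trace.set_tracer_provider(provider)

tracer = trace.get_tracer(__name__)


def process_batch(events: list[dict]) -> None:
    # One span per batch; attributes make the trace queryable later.
    with tracer.start_as_current_span("process_batch") as span:
        span.set_attribute("batch.size", len(events))
        for event in events:
            pass  # transformation/enrichment would happen here


process_batch([{"kind": "metric"}, {"kind": "log"}])
```
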
Required Skills
  • Hands-on experience building streaming data pipelines with Kafka (producers/consumers, schema registry, Kafka Connect, ksqlDB, Kafka Streams).
  • Proficiency with OpenShift/Kubernetes telemetry (OpenTelemetry, Prometheus) and CLI tooling.
  • Experience integrating telemetry into Splunk (HEC, UF, sourcetypes, CIM), building dashboards and alerting.
  • Strong data engineering skills in Python (or similar) for ETL/ELT, enrichment, and validation.
  • Knowledge of event schemas (Avro/Protobuf/JSON), contracts, and backward/forward compatibility; a schema-evolution sketch follows this list.
  • Familiarity with observability standards and practices; ability to drive toward Level 4 maturity (proactive monitoring, automated insights).
  • Understanding of hybrid cloud and multi-cluster telemetry patterns.
  • Security and compliance for data pipelines: secret management, RBAC, encryption in transit/at rest.
  • Good problem-solving skills and ability to work in a collaborative team environment.
  • Strong communication and documentation skills.
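
On the compatibility point: a backward-compatible change is one where a new field carries a default, so records written under the old schema still validate. A small sketch with fastavro, with illustrative field names:

```python
from fastavro import parse_schema
from fastavro.validation import validate

# v2 of a telemetry event schema: the new "cluster" field has a default,
# so v1 records that lack it remain valid (backward compatible).
schema_v2 = parse_schema({
    "type": "record",
    "name": "TelemetryEvent",
    "fields": [
        {"name": "timestamp", "type": "long"},
        {"name": "source", "type": "string"},
        {"name": "body", "type": "string"},
        {"name": "cluster", "type": "string", "default": "unknown"},
    ],
})

v1_record = {"timestamp": 1700000000000, "source": "openshift", "body": "ok"}
validate(v1_record, schema_v2)  # passes: the default covers the missing field
```

In a registry-backed deployment, the same check would typically run server-side (e.g. a schema registry's compatibility mode) before a producer can register the new version.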