Data Engineer

IO Associates

Cheltenham

On-site

GBP 40,000 - 80,000

Full time

13 days ago

Job summary

An established industry player is looking for a Data Engineer to join their dynamic team on a contract basis. This role focuses on delivering innovative data solutions to support National Security clients through the design and management of secure data pipelines. You will work with diverse datasets and apply advanced distributed computing techniques to transform raw data into actionable insights. As part of a rapidly growing organisation, you'll have the opportunity to make a tangible impact while collaborating with a passionate team. If you're ready to take on complex challenges and help shape the future of data engineering, this role is perfect for you.

Qualifications

  • Proficient in ETL/ELT processes and data technologies.
  • Strong coding skills in Python, Java, or Go required.

Responsibilities

  • Design and manage secure data pipelines for actionable insights.
  • Collaborate with stakeholders to solve complex data challenges.

Skills

ETL/ELT workflows
Apache Kafka
NiFi
Spark
Flink
Airflow
SQL databases
NoSQL databases
Python
Java
Go
Distributed computing

Job description

Role: Data Engineer - Contract - Both Outside and Inside IR35 positions

Overview: We are seeking a professional Data Engineer to deliver mission-critical technical solutions as part of a contract engagement. This role focuses on building robust data services to support National Security clients. You will collaborate closely with stakeholders to solve complex challenges through innovative data engineering practices.

As a Data Engineer, your primary responsibility will be to design and manage secure data pipelines that transform raw information into actionable insights for National Security purposes. Working with diverse datasets—including batch, streaming, real-time, and unstructured data—you will apply advanced distributed computing techniques to process and analyse large-scale datasets efficiently.

Key Requirements:

  • Ability to create and implement Extract, Transform, Load (ETL) or Extract, Load, Transform (ELT) workflows for transferring data from source systems to data stores.
  • Proficiency in one or more data technologies such as Apache Kafka, NiFi, Spark, Flink, or Airflow.
  • Demonstrable experience with SQL and NoSQL databases, including PostgreSQL, MongoDB, Elasticsearch, Accumulo, or Neo4j.
  • Coding expertise in modern programming languages such as Python, Java, or Go.
  • Strong familiarity with distributed computing techniques and scalable data systems.

Company Insight:

With over 60 years of expertise, our client is a leader in advancing technologies across sensors, communications, cybersecurity, artificial intelligence, and data science. They empower organisations with actionable insights derived from multi-layered data analysis, shaping the way they think and act. Their mission is simple: to deliver innovative technical solutions that ensure safety and security for all.

You’ll be part of a team united by a shared ambition. As a business experiencing rapid growth—having doubled in size over the past four years—they are set to double again by 2027! Every voice is heard, and every idea matters. Together, they break boundaries, reinvest in innovation, and empower individuals to make a tangible difference.

Location/on-site requirements:

Based out of Manchester, the majority of your time will be spent at HQ or at the customer's site, which is 15 minutes away. They understand that not everyone has easy access to Manchester, so there is some flexibility around this. If you live in London, or further afield in the South West of England, you could work from other sites closer to your home. Ideally, our client would like you on-site 2-3 days per week given the nature of the work.
